Unity Manual

Welcome to Unity.

Unity is made to empower you to create the best interactive entertainment or multimedia experience that you can. This manual is designed to help you learn how to use Unity, from basic to advanced techniques. It can be read from start to finish or used as a reference.

The manual is divided into different sections. The first section, User Guide, is an introduction to Unity's interface, asset workflow, and the basics of building a game. If you are new to Unity, you should start by reading the Unity Basics subsection.

The iOS Guide addresses iOS-specific topics such as the iOS scripting API, optimizations, and general platform development questions.

The Android Guide addresses Android-specific topics such as setting up the Android SDK and general development questions.

The next section, FAQ, is a collection of frequently asked questions about performing common tasks that require a few steps.

The last section, Advanced, addresses topics such as game optimization, shaders, file sizes, and deployment.

When you've finished reading, take a look at the Reference Manual and the Scripting Reference for further details about the different possibilities of constructing your games with Unity.

If you have a question that is not answered in this manual, please ask on Unity Answers or the Unity Forums, where you will be able to find an answer.

Happy reading,
The Unity team

The Unity Manual Guide contains some sections that apply only to certain platforms. Please select which platforms you want to see. Platform-specific information can always be seen by clicking on the disclosure triangles on each page.

Page last updated: 2012-11-16



User Guide

This section of the Manual is focused on the features and functions of Unity. It discusses the interface, core Unity building blocks, asset workflow, and basic gameplay creation. By the time you are done reading the user guide, you will have a solid understanding of how to use Unity to put together an interactive scene and publish it.

We recommend that new users begin by reading the Unity Basics section.

Page last updated: 2010-09-09



Unity Basics

This section is your key to getting started with Unity. It will explain the Unity interface, menu items, using assets, creating scenes, and publishing builds.

When you are finished reading this section, you will understand how Unity works, how to use it effectively, and the steps to put a basic game together.

 Learning the Interface
There is a lot to learn, so take the time you need to observe and understand the interface. We will walk through each interface element together.

 Asset Workflow
Here we'll explain the steps to use a single asset with Unity. These steps are general and are meant only as an overview for basic actions.

 Creating Scenes
Scenes contain the objects of your game. In each Scene, you will place your environments, obstacles, and decorations, designing and building your game in pieces.

 Publishing Builds
At any time while you are creating your game, you might want to see how it looks when you build and run it outside of the editor as a standalone or web player. This section will explain how to access the Build Settings and how to create different builds of your games.

 Tutorials
These online tutorials will let you work with Unity while you follow along, providing hands-on experience with building real projects.

Page last updated: 2010-09-10



Learning the Interface

Let's begin by taking a look at the Unity editor interface so you can become familiar with it. The main editor window is made up of several tabbed windows, called Views. There are several types of Views in Unity, each of which has a specific purpose, described in the sections below.

Page last updated: 2012-11-30



Project View

In this view, you can access and manage the assets that belong to your project.

The left panel of the browser shows the folder structure of the project as a hierarchical list. When a folder is selected from the list by clicking, its contents will be shown in the panel to the right. The individual assets are shown as icons that indicate their type (script, material, sub-folder, etc). The icons can be resized using the slider at the bottom of the panel; they will be replaced by a hierarchical list view if the slider is moved to the extreme left. The space to the left of the slider shows the currently selected item, including a full path to the item if a search is being performed.

Above the project structure list is a Favorites section where you can keep frequently-used items for easy access. You can drag items from the project structure list to the Favorites and also save search queries there (see Searching below).

Just above the panel is a "breadcrumb trail" that shows the path to the folder currently being viewed. The separate elements of the trail can be clicked for easy navigation around the folder hierarchy. When searching, this bar changes to show the area being searched (the root Assets folder, the selected folder or the Asset Store) along with a count of free and paid assets available in the store, separated by a slash. There is an option in the General section of Unity's Preferences window to disable the display of Asset Store hit counts if they are not required.

Along the top edge of the window is the browser's toolbar.

Located at the left side of the toolbar, the Create menu lets you add new assets and sub-folders to the current folder. To its right are a set of tools to allow you to search the assets in your project.

The Window menu provides the option of switching to a one-column version of the project view, essentially just the hierarchical structure list without the icon view. The lock icon next to the menu enables you to "freeze" the current contents of the view (ie, stop them being changed by events elsewhere) in a similar manner to the inspector lock.

Searching

The browser has a powerful search facility that is especially useful for locating assets in large or unfamiliar projects. The basic search will filter assets according to the text typed in the search box.

If you type more than one search term then the search is narrowed, so if you type coastal scene it will only find assets with both "coastal" and "scene" in the name (ie, terms are ANDed together).

To the right of the search bar are three buttons. The first allows you to further filter the assets found by the search according to their type.

Continuing to the right, the next button filters assets according to their Label (labels can be set for an asset in the Inspector). Since the number of labels can potentially be very large, the label menu has its own mini-search filter box.

Note that the filters work by adding an extra term in the search text. A term beginning with "t:" filters by the specified asset type, while "l:" filters by label. You can type these terms directly into the search box rather than use the menu if you know what you are looking for. You can search for more than one type or label at once. Adding several types will expand the search to include all specified types (ie, types will be ORed together). Adding multiple labels will narrow the search to items that have all the specified labels (ie, labels are ANDed).
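As an illustration, the example queries below show how the terms combine (the type and label names here are placeholders, not a fixed list):

```
coastal scene          assets with both "coastal" and "scene" in the name
t:Texture t:Material   all textures plus all materials (types are ORed)
l:enemy l:boss rock    assets labelled both "enemy" and "boss" with "rock" in the name
```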

The rightmost button saves the search by adding an item to the Favorites section of the asset list.

Searching the Asset Store

The Project Browser's search can also be applied to assets available from the Unity Asset Store. If you choose Asset Store from the menu in the breadcrumb bar, all free and paid items from the store that match your query will be displayed. Searching by type and label works the same as for a Unity project. The search query words will be checked against the asset name first and then the package name, package label and package description in that order (so an item whose name contains the search terms will be ranked higher than one with the same terms in its package description).

If you select an item from the list, its details will be displayed in the inspector along with the option to purchase and/or download it. Some asset types have previews available in this section so you can, for example, play an audio clip or rotate a 3D model before buying. The inspector also gives the option of viewing the asset in the usual Asset Store window to see additional details.

Shortcuts

The following keyboard shortcuts are available when the browser view has focus. Note that some of them only work when the view is using the two-column layout (you can switch between the one- and two-column layouts using the panel menu in the very top right corner).

F -- Frame selection
Tab -- Shift focus between first column and second column (two columns)
Ctrl/Cmd + F -- Focus search field
Ctrl/Cmd + A -- Select all visible items in list
Ctrl/Cmd + D -- Duplicate selected assets
Delete -- Delete with dialog
Delete + Shift -- Delete without dialog
Backspace + Cmd -- Delete without dialog (OSX)
Enter -- Begin rename selected (OSX)
Cmd + down arrow -- Open selected assets (OSX)
Cmd + up arrow -- Jump to parent folder (OSX, two columns)
F2 -- Begin rename selected (Win)
Enter -- Open selected assets (Win)
Backspace -- Jump to parent folder (Win, two columns)
Right arrow -- Expand selected item (tree views and search results). If the item is already expanded, this will select its first child item.
Left arrow -- Collapse selected item (tree views and search results). If the item is already collapsed, this will select its parent item.
Alt + right arrow -- Expand item when showing assets as previews
Alt + left arrow -- Collapse item when showing assets as previews

Page last updated: 2012-11-26



Hierarchy


The Hierarchy contains every GameObject in the current Scene. Some of these are direct instances of asset files like 3D models, and others are instances of Prefabs, custom objects that will make up much of your game. You can select objects in the Hierarchy and drag one object onto another to make use of Parenting (see below). As objects are added and removed in the scene, they will appear and disappear from the Hierarchy as well.

Parenting

Unity uses a concept called Parenting. To make any GameObject the child of another, drag the desired child onto the desired parent in the Hierarchy. A child will inherit the movement and rotation of its parent. You can use a parent object's foldout arrow to show or hide its children as necessary.


Two unparented objects

One object parented to another

To learn more about Parenting, please review the Parenting section of the Transform Component page.

Page last updated: 2012-10-18



Toolbar

The Toolbar consists of five basic controls. Each relates to a different part of the Editor.

Transform Tools -- used with the Scene View
Transform Gizmo Toggles -- affect the Scene View display
Play/Pause/Step Buttons -- used with the Game View
Layers Drop-down -- controls which objects are displayed in Scene View
Layout Drop-down -- controls arrangement of all Views

Page last updated: 2012-10-18



Scene View


The Scene View

The Scene View is your interactive sandbox. You will use the Scene View to select and position environments, the player, the camera, enemies, and all other GameObjects. Maneuvering and manipulating objects within the Scene View are some of the most important functions in Unity, so it's important to be able to do them quickly. To this end, Unity provides keystrokes for the most common operations.

Scene View Navigation

See Scene View Navigation for full details on navigating the scene view. Here's a brief overview of the essentials:

You might also find the Hand Tool useful (shortcut: Q), especially if you are using a one-button mouse. With the Hand tool selected:

Click-drag to drag the camera around.
Hold Alt and click-drag to orbit the camera around the current pivot point.
Hold Control (Command on Mac) and click-drag to zoom the camera.

In the upper-right corner of the Scene View is the Scene Gizmo. This displays the Scene Camera's current orientation, and allows you to quickly modify the viewing angle.

Each of the coloured "arms" of the gizmo represents a geometric axis. You can click on any of the arms to set the camera to an orthographic (i.e., perspective-free) view looking along the corresponding axis. You can click on the text underneath the gizmo to switch between the normal perspective view and an isometric view. While in isometric mode, you can right-click drag to orbit, and Alt-click drag to pan.

Positioning GameObjects

See Positioning GameObjects for full details on positioning GameObjects in the scene. Here's a brief overview of the essentials:

When building your games, you'll place lots of different objects in your game world. To do this use the Transform Tools in the Toolbar to Translate, Rotate, and Scale individual GameObjects. Each has a corresponding Gizmo that appears around the selected GameObject in the Scene View. You can use the mouse and manipulate any Gizmo axis to alter the Transform Component of the GameObject, or you can type values directly into the number fields of the Transform Component in the Inspector.

Scene View Control Bar


The Scene View control bar lets you see the scene in various view modes - Textured, Wireframe, RGB, Overdraw, and many others. It will also enable you to see (and hear) in-game lighting, game elements, and sound in the Scene View. See View Modes for all the details.

Page last updated: 2012-10-20



Game View


The Game View is rendered from the Camera(s) in your game. It is representative of your final, published game. You will need to use one or more Cameras to control what the player actually sees while playing your game. For more information about Cameras, please view the Camera Component page.

Play Mode

Press the Play Mode button in the editor Toolbar to run your game. Any changes you make while in Play Mode are temporary, and will be reset when you exit Play Mode. The editor UI darkens to remind you that your changes are temporary.

Game View Control Bar

The first drop-down on the Game View control bar is the Aspect drop-down. Here you can force the aspect ratio of the Game View window to different values. You can use this to test how your game will look on monitors with different aspect ratios.

Further to the right is the Maximize on Play toggle. While enabled, the Game View will maximize itself to 100% of your editor window for a nice full-screen preview when you enter Play Mode.

Continuing to the right is the Stats button. This shows a Rendering Statistics window that is very useful for monitoring the graphics performance of your game (see Optimizing Graphics Performance for further details).

The last button is the Gizmos toggle. While enabled, all Gizmos that appear in the Scene View will also be drawn in the Game View. This includes all Gizmos drawn using the Gizmos class functions. Pressing the Gizmos button brings up a popup menu where you can show or hide the many different types of Components used in the game.

Next to each Component's name in the menu are its icon and its associated gizmo setting. Clicking the icon setting brings up another popup menu, from which you can choose either a preset icon or a custom icon defined from a texture.

Gizmo drawing can be disabled for specific Components using their settings in the Gizmos menu.

The 3D Gizmos setting at the top of the menu applies to the gizmo icons. When it is enabled, the icons are drawn in perspective relative to the Camera (i.e., icons for nearby objects appear larger than those for distant ones); when it is disabled, the icons are drawn at the same size regardless of distance. The slider next to the checkbox lets you vary the size of the icons, which is useful for avoiding clutter when many gizmos are visible.

Page last updated: 2012-11-26



Inspector


Games in Unity are made up of multiple GameObjects that contain meshes, scripts, sounds, or other graphical elements like Lights. The Inspector displays detailed information about your currently selected GameObject, including all attached Components and their properties. Here, you modify the functionality of GameObjects in your scene. You can read more about the GameObject-Component relationship, as it is very important to understand.

Any property that is displayed in the Inspector can be directly modified. Even script variables can be changed without modifying the script itself. You can use the Inspector to change variables at runtime to experiment and find the magic gameplay for your game. In a script, if you define a public variable of an object type (like GameObject or Transform), you can drag and drop a GameObject or Prefab into the Inspector to make the assignment.
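As a small sketch of this (the variable names here are made up for illustration), a script like the following exposes its public variables in the Inspector, where the target field can be assigned by dragging a GameObject onto it:

```javascript
// Example script: both public variables appear in the Inspector.
var target : Transform;   // assign by dragging an object from the Hierarchy onto this field
var speed : float = 5.0;  // can be tweaked in the Inspector, even during Play Mode

function Update () {
	if (target != null)
		transform.position = Vector3.MoveTowards (transform.position,
			target.position, speed * Time.deltaTime);
}
```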

Click the question mark beside any Component name in the Inspector to load its Component Reference page. Please view the Component Reference for a complete and detailed guide to all of Unity's Components.


Add Components from the Component menu

You can click the tiny gear icon (or right-click the Component name) to bring up a context menu for the specific Component.

The Inspector will also show any Import Settings for a selected asset file.


Click Apply to reimport your asset.

Use the Layer drop-down to assign a rendering Layer to the GameObject. Use the Tag drop-down to assign a Tag to this GameObject.

Prefabs

If you have a Prefab selected, some additional buttons will be available in the Inspector. For more information about Prefabs, please view the Prefab manual page.

Labels

Unity allows assets to be marked with Labels to make them easier to locate and categorise. The bottom item on the inspector is the Asset Labels panel.

The Asset Labels panel (empty)

At the bottom right of this panel is a button titled with an ellipsis ("...") character. Clicking this button will bring up a menu of available labels.

The labels menu

You can select one or more items from the labels menu to mark the asset with those labels (they will also appear in the Labels panel). If you click a second time on one of the active labels, it will be removed from the asset.

The labels menu with some labels active

The menu also has a text box that you can use to specify a search filter for the labels in the menu. If you type a label name that does not yet exist and press return/enter, the new label will be added to the list and applied to the selected asset. If you remove a custom label from all assets in the project, it will disappear from the list.

Once you have applied labels to your assets, you can use them to refine searches in the Project Browser (see this page for further details). You can also access an asset's labels from an editor script using the AssetDatabase class.
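As an example of the scripting route, an editor script along these lines (placed in an Editor folder; the menu name and label value are invented for illustration) reads the selected asset's labels and adds one more:

```javascript
// Editor-only sketch: log the selected asset's labels, then append an example label.
import UnityEditor;

@MenuItem ("Assets/Log And Add Label")
static function LogAndAddLabel () {
	var asset = Selection.activeObject;
	var labels = AssetDatabase.GetLabels (asset);
	Debug.Log ("Current labels: " + String.Join (", ", labels));

	// Build a new array containing the old labels plus one example label.
	var newLabels = new String[labels.Length + 1];
	System.Array.Copy (labels, newLabels, labels.Length);
	newLabels[labels.Length] = "ExampleLabel";
	AssetDatabase.SetLabels (asset, newLabels);
}
```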

Page last updated: 2012-11-26



Other Views

The Views described on this page cover the basics of the interface in Unity. The other Views in Unity are described on separate pages.

Page last updated: 2012-11-28



Customizing Your Workspace


You can customize the Layout of Views by click-dragging the tab of any View to one of several locations. Dropping a tab in the Tab Area of an existing window will add the tab beside any existing tabs. Alternatively, dropping a tab in any Dock Zone will add the View to a new window.


Views can be docked to the sides or bottom of any existing window

Tabs can also be detached from the main editor window and arranged into their own floating editor windows. Floating windows can contain arrangements of Views and tabs just like the main editor window.


Floating editor windows are the same as the main editor window, except there is no Toolbar

When you've created a layout of editor windows you like, you can save the layout and restore it at any time. To do this, expand the Layout drop-down (on the Toolbar) and choose Save Layout.... Name and save your new layout, then restore it simply by choosing it from the Layout drop-down.


A completely custom layout

At any time, you can right-click the tab of any View to see options such as Maximize, or to add a new tab to the same window.

Page last updated: 2012-11-09



Asset Workflow

Here we'll explain the steps to use a single asset with Unity. These steps are general and are meant only as an overview for basic actions. For the example, we'll talk about using a 3D mesh.

Create Rough Asset

Use any supported 3D modeling package to create a rough version of your asset. Our example will use Maya. Work with the asset until you are ready to save. For a list of applications that are supported by Unity, please see this page.

Import

When you save your asset initially, you should save it normally into the Assets folder in your Project folder. When you open the Unity project, the asset will be detected and imported into the project. When you look in the Project View, you'll see the asset located there, right where you saved it. Please note that Unity uses the FBX exporter provided by your modeling package to convert your models to the FBX file format, so you will need to have the FBX exporter of your modeling package available for Unity to use. Alternatively, you can export your model directly as FBX from your application and save it into your project's Assets folder. For a list of applications that are supported by Unity, please see this page.

Import Settings

If you select the asset in the Project View the import settings for this asset will appear in the Inspector. The options that are displayed will change based on the type of asset that is selected.

Adding Asset to the Scene

Simply click and drag the mesh from the Project View to the Hierarchy or Scene View to add it to the Scene. When you drag a mesh to the scene, you are creating a GameObject that has a Mesh Renderer Component. If you are working with a texture or a sound file, you will have to add it to a GameObject that already exists in the Scene or Project.

Putting Different Assets Together

Here is a brief description of the relationships between the most common assets.

Creating a Prefab

Prefabs are a collection of GameObjects & Components that can be re-used in your scenes. Several identical objects can be created from a single Prefab, called instancing. Take trees for example. Creating a tree Prefab will allow you to instance several identical trees and place them in your scene. Because the trees are all linked to the Prefab, any changes that are made to the Prefab will automatically be applied to all tree instances. So if you want to change the mesh, material, or anything else, you just make the change once in the Prefab and all the other trees inherit the change. You can also make changes to an instance, and choose GameObject->Apply Changes to Prefab from the main menu. This can save you lots of time during setup and updating of assets.

When you have a GameObject that contains multiple Components and a hierarchy of child GameObjects, you can make a Prefab of the top-level GameObject (or root), and re-use the entire collection of GameObjects.

Think of a Prefab as a blueprint for a structure of GameObjects. All the Prefab clones are identical to the blueprint. Therefore, if the blueprint is updated, so are all the clones. There are different ways you can update the Prefab itself by changing one of its clones and applying those changes to the blueprint. To read more about using and updating Prefabs, please view the Prefabs page.
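Prefab instances can also be created from a script. In this hedged sketch, treePrefab is an example variable you would assign by dragging a Prefab onto it in the Inspector:

```javascript
// Sketch: create five copies of a Prefab at different positions.
var treePrefab : GameObject;

function Start () {
	for (var i = 0; i < 5; i++) {
		// Place each clone four units apart along the x axis.
		Instantiate (treePrefab, Vector3 (i * 4.0, 0, 0), Quaternion.identity);
	}
}
```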

To actually create a Prefab from a GameObject in your scene, simply drag the GameObject from the scene into the project, and you should see the Game Object's name text turn blue. Name the new Prefab whatever you like. You have now created a re-usable prefab.

Updating Assets

You have imported, instantiated, and linked your asset to a Prefab. Now when you want to edit your source asset, just double-click it from the Project View. The appropriate application will launch, and you can make any changes you want. When you're done updating it, just Save it. Then, when you switch back to Unity, the update will be detected, and the asset will be re-imported. The asset's link to the Prefab will also be maintained. So the effect you will see is that your Prefab will update. That's all you have to know to update assets. Just open it and save!

Optional - Adding Labels to Assets

It is always a good idea to add labels to your assets if you want to keep them organized. You can then search for the labels associated with each asset in the search field of the Project View or in the Object Selector.

Steps for adding a label to an asset:

  • Select the asset in the Project View.
  • In the Asset Labels panel at the bottom of the Inspector, click the ellipsis ("...") button.
  • Choose an existing label from the menu, or type a new label name and press return/enter.


Page last updated: 2012-09-15



Creating Scenes

Scenes contain the objects of your game. They can be used to create a main menu, individual levels, and anything else. Think of each unique Scene file as a unique level. In each Scene, you will place your environments, obstacles, and decorations, essentially designing and building your game in pieces.

Instancing Prefabs

Use the method described in the previous section to create a Prefab; you can read more details about Prefabs here. Once you've created a Prefab, you can quickly and easily make copies of it, called Instances. To create an instance of any Prefab, drag the Prefab from the Project View to the Hierarchy or Scene View. You now have a unique instance of your Prefab to position and tweak as you like.

Adding Components & Scripts

With a Prefab or any GameObject highlighted, you can add further functionality to it using Components. Please view the Component Reference for details about all the different Components. Scripts are a type of Component. To add a Component, just highlight your GameObject and select a Component from the Component menu. You will then see the Component appear in the GameObject's Inspector. Scripts are also contained in the Component menu by default.

If adding a Component breaks the GameObject's connection to its Prefab, you can always use GameObject->Apply Changes to Prefab from the menu to re-establish the link.

Placing GameObjects

Once your GameObject is in the scene, you can use the Transform Tools to position it wherever you like. Additionally, you can use the Transform values in the Inspector to fine-tune its placement and rotation. Please view the Transform Component page for more information about positioning and rotating GameObjects.

Working with Cameras

Cameras are the eyes of your game. Everything the player sees while playing is seen through one or more Cameras. You can position, rotate, and parent Cameras just like any other GameObject. A Camera is simply a GameObject with a Camera Component attached to it, so it can do anything a regular GameObject can do, plus some camera-specific functions. There are also some useful Camera scripts that are installed with the standard assets when you create a new project; you can find them in Components->Camera-Control in the menu. There are some additional aspects to Cameras that are worth understanding; to read about Cameras, view the Camera Component reference.

Lights

Except in very rare cases, you will always need to add Lights to your scene. There are three different types of lights, and all of them behave a little differently from each other. The important thing is that they add atmosphere and ambience to your game; different lighting can completely change the mood of your game, and using lights effectively is an important subject to learn. To read about the different lights, please view the Light Component reference.

Page last updated: 2012-11-09



Publishing Builds

While you are creating your game, you might want to see how it looks when you build and run it outside of the editor as a standalone or web player. This section will explain how to access the Build Settings and how to create different builds of your games.

File->Build Settings... is the menu item to access the Build Settings window. It pops up an editable list of the scenes that will be included when you build your game.


The Build Settings window

The first time you view this window in a project, it will appear blank. If you build your game while this list is blank, only the currently open scene will be included in the build. If you want to quickly build a test player with only one scene file, just build a player with a blank scene list.

It is easy to add scene files to the list for a multi-scene build. There are two ways to add them. The first is to click the Add Current button; you will see the currently open scene appear in the list. The second is to drag scene files from the Project View into the list.

At this point, notice that each of your scenes has a different index value. Scene 0 is the first scene that will be loaded when you build the game. When you want to load a new scene, use Application.LoadLevel() inside your scripts.

If you've added more than one scene file and want to rearrange them, simply click and drag the scenes within the list until you have them in the desired order.

If you want to remove a scene from the list, click to highlight the scene and press Command-Delete. The scene will disappear from the list and will not be included in the build.

When you are ready to publish your build, select a Platform and make sure that the Unity logo appears next to that platform; if it doesn't, click the Switch Platform button to tell Unity which platform you want to build for. Finally, press the Build button. You will be able to choose a name and location for the game using a standard Save dialog. When you click Save, Unity builds your game right away. It's that simple. If you are unsure where to save your built game, consider saving it into the project's root folder. You cannot save the build into the Assets folder.

Enabling the Debug build checkbox on any standalone player will enable Profiler functionality, and the player will be built with debug symbols so that you can use third-party profiling or debugging tools. Enabling the Development Build checkbox on a player will enable Profiler functionality and also make the Autoconnect Profiler and Script Debugging options available.

Desktop

Web Player Streaming

Streaming Web Players allow your web player games to begin playing as soon as Scene 0 is finished loading. If you have a game with 10 levels, it doesn't make much sense to force the player to wait and download all the assets for levels 2-10 before they can start playing level 1. When you publish a Streaming Web Player, the assets that must be downloaded will be ordered by the Scene file in which they appear. As soon as all assets in Scene 0 have finished downloading, the Web Player will begin playing.

Put simply, Streaming Web Players get players playing your game faster than ever.

The only thing you need to worry about is making sure that the next level you want to load has finished streaming before you load it.

Normally, in a non-streamed player, you use the following code to load a level:

Application.LoadLevel("levelName");

In a Streaming Web Player, you must first check that the level has finished streaming. This is done through the CanStreamedLevelBeLoaded() function. It works like this:

var levelToLoad = 1;

function LoadNewLevel () {
	// Only load the level once its streamed assets have finished downloading.
	if (Application.CanStreamedLevelBeLoaded (levelToLoad)) {
		Application.LoadLevel (levelToLoad);
	}
}

If you would like to display the level streaming progress to the player, for a loading bar or other indicator, you can read the progress by accessing GetStreamProgressForLevel().
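A minimal sketch of such a progress display (the layout values and variable names are arbitrary choices for illustration):

```javascript
// Sketch: show streaming progress for the next level as a percentage.
var levelToLoad = 1;

function OnGUI () {
	if (!Application.CanStreamedLevelBeLoaded (levelToLoad)) {
		// Progress is reported as a value between 0 and 1.
		var progress = Application.GetStreamProgressForLevel (levelToLoad);
		GUI.Label (Rect (10, 10, 300, 30),
			"Streaming level: " + Mathf.RoundToInt (progress * 100) + "%");
	}
}
```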

Offline webplayer deployment

If the Offline Deployment option is enabled for a webplayer build, the UnityObject.js file (used to interface the player with the host page) will be placed alongside the player during the build. This enables the player to work with the local copy of the script file even when there is no network connection; normally, UnityObject.js is downloaded from Unity's webserver so that the latest version can be used.

Building standalone players

With Unity you can build standalone applications for Windows and Mac (Intel, PowerPC, or Universal, which runs on both architectures). Choosing the build target in the Build Settings dialog and hitting the Build button is all it takes. When building standalone players, the resulting files will vary depending on the build target. On Windows, an executable file (.exe) will be built, along with a Data folder that contains all the resources for your application. On Mac, an app bundle will be built, containing both the resources and the files needed to run the application.

To distribute a standalone on Mac, you just provide the app bundle (everything is packed inside it). On Windows, you need to provide both the .exe file and the Data folder for others to run the game. Think of it this way: other people must have the same files on their computer that Unity built for you in order to run your game.

Inside the build process

The build process places a blank copy of the built game application wherever you specify. Then it works through the scene list in the Build Settings, opening each scene in the editor one at a time, optimizing it, and integrating it into the application package. It also calculates all the assets required by the included scenes and stores them as individual files within the application package.

  • Any GameObject in a scene that is tagged EditorOnly will not be included in the published build. This is useful for debug scripts that don't need to be included in the final game.
  • When a new level loads, all the objects from the previous level are destroyed. To prevent this, use DontDestroyOnLoad() on any objects you don't want destroyed. This is most commonly used to keep music playing while loading a level, or for game controller scripts that maintain game state and progress.
  • After a new level has finished loading, the OnLevelWasLoaded() message will be sent to all active game objects.
  • For more information on how best to create a game with multiple scenes (for instance, a main menu, a high-score screen, and actual game levels), see the Scripting Tutorial.pdf.
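The DontDestroyOnLoad() and OnLevelWasLoaded() points above can be sketched in one script, attached for example to a music GameObject:

```javascript
// Sketch: survive level loads and log each newly loaded level.
function Awake () {
	DontDestroyOnLoad (gameObject);  // this GameObject is no longer destroyed on level load
}

function OnLevelWasLoaded (level : int) {
	Debug.Log ("Finished loading level " + level);
}
```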

iOS

Inside the iOS build process

The iPhone/iPad application build process is a two-step process:

  1. An XCode project is generated with all the required libraries, precompiled .NET code, and serialized assets.
  2. The XCode project is built and deployed on the actual device.

When you press Build in the Build Settings dialog, only the first step is performed. Pressing Build and Run performs both steps. If you select an already existing folder in the project save dialog, a warning will be displayed. There are currently two XCode project generation modes to choose from:

  • replace - all the files in the target folder are removed and the new content is generated.
  • append - the Data and Libraries folders and the project root folder are cleaned and filled with newly generated content. The XCode project file is updated according to the latest Unity project changes. The Classes subfolder of the XCode project can be considered a safe place to keep custom native code, but making regular backups is recommended. Append mode is supported only for existing XCode projects generated by the same Unity iOS version.

If Cmd+B is pressed, the automatic build-and-run process is invoked, and the most recently used folder is assumed as the build target. In this case, append mode is assumed by default.

Android

The Android application build process is a two-step process:

  1. An application package (.apk file) is generated with all the required libraries, precompiled .NET code, and serialized assets.
  2. The application package is deployed on the actual device.

When you press Build in the Build Settings dialog, only the first step is performed. Pressing Build and Run performs both steps. If Cmd+B is pressed, the automatic build-and-run process is invoked, and the most recently used folder is assumed as the build target.

The first time you build an Android project, Unity will ask you to locate the Android SDK, which is required to build and install your Android application on a device. You can change this setting later in the Preferences.

When building an app for an Android device, make sure that the USB Debugging and Allow mock locations checkboxes are enabled in the device settings.

You can verify that the OS sees your device by running the adb devices command found in your Android SDK/platform-tools folder. This works on both Mac and Windows.

Unity builds an application archive (.apk file) and installs it on your connected device. In some cases your application cannot start automatically as it can on iPhone, so you need to unlock the screen and, in some rare cases, find the newly installed application in the menu.

Texture Compression

Under Build Settings you will also find the Texture Compression option. By default, Unity uses the ETC1/RGBA16 texture format for textures that don't have individual texture format overrides (see Texture 2D / Per-Platform Overrides).

If you want to build an application archive (.apk file) targeting specific hardware architectures, you can use the Texture Compression option to override this default behavior. Any texture that is set to an uncompressed format will be left alone; only textures using a compressed format will use the format selected in the Texture Compression option.

To make sure the application is only deployed on devices that support the selected texture compression, Unity will edit the AndroidManifest to include tags matching the particular format selected. This enables the Android Market filtering mechanism to serve the application only to devices with the appropriate graphics hardware.

Preloading

Published builds automatically preload all assets in a scene when the scene loads. The exception to this rule is scene 0, because the first scene is usually a splash screen that you want to display as quickly as possible.

To make sure all your content is preloaded, you can create an empty scene that calls Application.LoadLevel(1). In the Build Settings, make this empty scene's index 0. All subsequent levels will then be preloaded.
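A minimal sketch of the script for that empty scene at index 0:

```javascript
// Sketch: immediately hand over to the real first scene. Because this empty
// scene is index 0, every subsequent level will be preloaded as normal.
function Start () {
	Application.LoadLevel (1);
}
```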

You're ready to build games

By now, you have learned how to use Unity's interface, how to use assets, how to create scenes, and how to publish your builds. There is nothing stopping you from creating the game of your dreams. You'll certainly learn much more along the way, and we're here to help.

To learn more about using Unity itself, you can continue reading the manual or follow the Tutorials.

To learn more about Components, the nuts and bolts of game behaviors, please read the Component Reference.

To learn more about scripting, please read the Scripting Reference.

To learn more about creating art assets, please read the Assets section of the manual.

To interact with the community of Unity users and developers, visit the Unity Forums. There you can ask questions, share projects, build a team, and more. Definitely visit the forums at least once; we want to see the amazing games that you make.

Page last updated: 2012-11-13



Tutorials

These tutorials will let you work with Unity while you follow along, giving you hands-on experience with building real projects. For new users, it is recommended that you follow the GUI Essentials and Scripting Essentials tutorials first; after that, you can follow any of them. They are all in PDF format, so you can print them out and follow along, or read them alongside Unity.

Note: These tutorials are written for the desktop version of Unity and will not work on Android or iOS devices (iPhone/iPad).

If you are searching for other resources for Unity, such as presentations, articles, assets, or extensions, you can find them here.

You can also check the latest additions to the tutorials by visiting the Unity3D Tutorials Home Page.

Page last updated: 2012-11-09



Unity Hotkeys

This page gives an overview of the default Unity hotkeys. If you would like the list as a PDF, you can download versions for Windows and MacOSX. Where a keystroke is listed as CTRL/CMD, use the Control key on Windows and the Command key on MacOSX.

Tools
Q -- Pan
W -- Move
E -- Rotate
R -- Scale
Z -- Pivot Mode toggle
X -- Pivot Rotation toggle
V -- Vertex Snapping
CTRL/CMD + LMB -- Snap

GameObject
CTRL/CMD+SHIFT+N -- New game object
CTRL/CMD+ALT+F -- Move to view
CTRL/CMD+SHIFT+F -- Align with view

Window
CTRL/CMD+1 -- Scene
CTRL/CMD+2 -- Game
CTRL/CMD+3 -- Inspector
CTRL/CMD+4 -- Hierarchy
CTRL/CMD+5 -- Project
CTRL/CMD+6 -- Animation
CTRL/CMD+7 -- Profiler
CTRL/CMD+9 -- Asset Store
CTRL/CMD+0 -- Animation
CTRL/CMD+SHIFT+C -- Console

Edit
CTRL/CMD+Z -- Undo
CTRL+Y (Windows only) -- Redo
CMD+SHIFT+Z (Mac only) -- Redo
CTRL/CMD+X -- Cut
CTRL/CMD+C -- Copy
CTRL/CMD+V -- Paste
CTRL/CMD+D -- Duplicate
SHIFT+Del -- Delete
F -- Frame (center) selection
CTRL/CMD+F -- Find
CTRL/CMD+A -- Select All

Selection
CTRL/CMD+SHIFT+1 -- Load Selection 1
CTRL/CMD+SHIFT+2 -- Load Selection 2
CTRL/CMD+SHIFT+3 -- Load Selection 3
CTRL/CMD+SHIFT+4 -- Load Selection 4
CTRL/CMD+SHIFT+5 -- Load Selection 5
CTRL/CMD+SHIFT+6 -- Load Selection 6
CTRL/CMD+SHIFT+7 -- Load Selection 7
CTRL/CMD+SHIFT+8 -- Load Selection 8
CTRL/CMD+SHIFT+9 -- Load Selection 9
CTRL/CMD+ALT+1 -- Save Selection 1
CTRL/CMD+ALT+2 -- Save Selection 2
CTRL/CMD+ALT+3 -- Save Selection 3
CTRL/CMD+ALT+4 -- Save Selection 4
CTRL/CMD+ALT+5 -- Save Selection 5
CTRL/CMD+ALT+6 -- Save Selection 6
CTRL/CMD+ALT+7 -- Save Selection 7
CTRL/CMD+ALT+8 -- Save Selection 8
CTRL/CMD+ALT+9 -- Save Selection 9

Assets
CTRL/CMD+R -- Refresh

Page last updated: 2012-11-28



Preferences

Unity provides a number of preference panels to allow you to customise the behaviour of the editor.

General

Auto Refresh    Should the editor update assets automatically as they change?
Always Show Project Wizard    Should the project wizard be shown at startup? (By default, it is shown only when the alt key is held down during launch)
Compress Assets On Import    Should assets be compressed automatically during import?
OSX Color Picker    Should the native OSX color picker be used instead of Unity's own?
Editor Analytics    Can the editor send information back to Unity automatically?
Show Asset Store search hits    Should the number of free/paid assets from the store be shown in the Project Browser?
Verify Saving Assets    Should Unity verify which assets to save individually on quitting?
Skin (Pro Only)    Which color scheme should Unity use for the editor? Pro users have the option of dark grey in addition to the default light grey.
Graphics Device    This is set to Automatic on the Mac but has options for Direct3D 9, Direct3D 11 and OpenGL on Windows.

External Tools

External Script Editor    Which application should Unity use to open script files?
Editor Attaching    Should Unity allow debugging to be controlled from the external script editor?
Image Application    Which application should Unity use to open image files?
Asset Server Diff Tool    Which application should Unity use to resolve file differences with the asset server?
Android SDK Location    Where in the filesystem is the Android SDK folder located?
iOS Xcode 4.x support    Should support for Xcode 4.x be enabled for iOS build targets?

Colors

This panel allows you to choose the colors that Unity uses when displaying various user interface elements.

Keys

This panel allows you to set the keystrokes that activate the various commands in Unity.

Cache Server

Use Cache Server    Should the cache server be enabled?
IP Address    IP address of the cache server, if enabled

Page last updated: 2012-10-26



Building Scenes

This section will explain the core elements you will work with to build scenes for complete games.

Page last updated: 2007-11-16



GameObjects

GameObjects are the most important objects in Unity. It is very important to understand what a GameObject is, and how it can be used. This page will explain all that for you.

What is a GameObject?

Every object in your game is a GameObject. However, GameObjects don't do anything on their own. They need special properties before they can become a character, an environment, or a special effect. All of these objects do very different things. If every object is a GameObject, how do we differentiate an interactive power-up object from a static room? What makes these GameObjects different from each other?

The answer is that GameObjects are containers. They are empty boxes which can hold the different pieces that make up a lightmapped island or a physics-driven car. So to really understand GameObjects, you have to understand these pieces, which are called Components. Depending on what kind of object you want to create, you add different combinations of Components to the GameObject. Think of a GameObject as an empty cooking pot, and Components as the different ingredients that make up your recipe of gameplay. You can also make your own Components using scripts.

The pages in this section explain GameObjects, Components, and Script Components.
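The container idea above can be sketched in a few lines of plain JavaScript. These are illustrative stand-ins, not the actual UnityEngine types: a GameObject holds a bag of Components, and what the object *is* depends entirely on what has been put in the bag.

```javascript
// Minimal stand-in for the GameObject/Component relationship (not the Unity API).
function GameObject(name) {
    this.name = name;
    this.components = [];
}
GameObject.prototype.AddComponent = function (component) {
    this.components.push(component);
    return component;
};
GameObject.prototype.GetComponent = function (type) {
    for (var i = 0; i < this.components.length; i++) {
        if (this.components[i].type === type) return this.components[i];
    }
    return null;
};

// The same kind of empty box becomes different things depending on its contents:
var powerUp = new GameObject("PowerUp");
powerUp.AddComponent({ type: "Collider", isTrigger: true });
powerUp.AddComponent({ type: "PowerUpScript" });

var room = new GameObject("Room");
room.AddComponent({ type: "MeshRenderer" });

console.log(powerUp.GetComponent("PowerUpScript") !== null); // true
console.log(room.GetComponent("PowerUpScript"));             // null
```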

Page last updated: 2012-11-13



The GameObject-Component Relationship

As described previously in GameObjects, a GameObject contains Components. We'll explore this relationship by discussing a GameObject and its most common Component -- the Transform Component. With any Unity Scene open, create a new GameObject (using Shift-Control-N on Windows or Shift-Command-N on Mac), select it and take a look at the Inspector.


The Inspector of an Empty GameObject

Notice that an empty GameObject still contains a Name, a Tag, and a Layer. Every GameObject also contains a Transform Component.

The Transform Component

It is impossible to create a GameObject in Unity without a Transform Component. The Transform Component is one of the most important Components, since all of the GameObject's Transform properties are enabled by its use of this Component. It defines the GameObject's position, rotation, and scale in the game world/Scene View. If a GameObject did not have a Transform Component, it would be nothing more than some information in the computer's memory. It effectively would not exist in the world.

The Transform Component also enables a concept called Parenting, which is utilized through the Unity Editor and is a critical part of working with GameObjects. To learn more about the Transform Component and Parenting, read the Transform Component Reference page.

Other Components

The Transform Component is critical to all GameObjects, so each GameObject has one. But GameObjects can contain other Components as well.


The Main Camera, added to each scene by default

Looking at the Main Camera GameObject, you can see that it contains a different collection of Components. Specifically, a Camera Component, a GUILayer, a Flare Layer, and an Audio Listener. All of these Components provide additional functionality to the GameObject. Without them, there would be nothing rendering the graphics of the game for the person playing! Rigidbodies, Colliders, Particles, and Audio are all different Components (or combinations of Components) that can be added to any given GameObject.

Page last updated: 2012-08-13



Using Components

Components are the nuts & bolts of objects and behaviors in a game. They are the functional pieces of every GameObject. If you don't yet understand the relationship between Components and GameObjects, read the GameObjects page before going any further.

A GameObject is a container for many different Components. By default, all GameObjects automatically have a Transform Component. This is because the Transform dictates where the GameObject is located, and how it is rotated and scaled. Without a Transform Component, the GameObject wouldn't have a location in the world. Try creating an empty GameObject now as an example. Click the GameObject->Create Empty menu item. Select the new GameObject, and look at the Inspector.

Even empty GameObjects have a Transform Component

Remember that you can always use the Inspector to see which Components are attached to the selected GameObject. As Components are added and removed, the Inspector will always show you which ones are currently attached. You will use the Inspector to change all the properties of any Component (including scripts).

Adding Components

You can add Components to the selected GameObject through the Components menu. We'll try this now by adding a Rigidbody to the empty GameObject we just created. Select it and choose Component->Physics->Rigidbody from the menu. When you do, you will see the Rigidbody's properties appear in the Inspector. If you press Play while the empty GameObject is still selected, you might get a little surprise. Try it and notice how the Rigidbody has added functionality to the otherwise empty GameObject. (The y-component of the GameObject starts to decrease. This is because the physics engine in Unity is causing the GameObject to fall under gravity.)

An empty GameObject with a Rigidbody Component attached
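The falling behaviour described above is ordinary gravity integration. As a rough stand-in for what the physics engine does each tick (illustrative only, not Unity's actual PhysX internals), the y position of an unsupported body decreases a little every step:

```javascript
// Toy gravity integration - a sketch of why the Rigidbody's y value decreases.
var gravity = -9.81;            // m/s^2, matching Unity's default Physics.gravity.y
var body = { y: 0, velocityY: 0 };

function step(dt) {             // one physics tick (semi-implicit Euler)
    body.velocityY += gravity * dt;
    body.y += body.velocityY * dt;
}

// Simulate one second at Unity's default fixed timestep of 0.02 s:
for (var i = 0; i < 50; i++) step(0.02);
console.log(body.y < 0); // true - the GameObject has fallen
```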

Another option is to use the Component Browser, which can be activated with the Add Component button in the object's inspector.

The browser lets you navigate the components conveniently by category and also has a search box that you can use to locate components by name.

You can attach any number or combination of Components to a single GameObject. Some Components work best in combination with others. For example, the Rigidbody works with any Collider. The Rigidbody controls the Transform through the NVIDIA PhysX physics engine, and the Collider allows the Rigidbody to collide and interact with other Colliders.

If you want to know more about using a particular Component, you can read about any of them in the Component Reference. You can also access the reference page for a Component from Unity by clicking on the small ? on the Component's header in the Inspector.

Editing Components

One of the great aspects of Components is flexibility. When you attach a Component to a GameObject, there are different values or Properties in the Component that can be adjusted in the editor while building a game, or by scripts when running the game. There are two main types of Properties: Values and References.

Look at the image below. It is an empty GameObject with an Audio Source Component. All the values of the Audio Source in the Inspector are the default values.

This Component contains a single Reference property, and seven Value properties. Audio Clip is the Reference property. When this Audio Source begins playing, it will attempt to play the audio file that is referenced in the Audio Clip property. If no reference is made, an error will occur because there is no audio to be played. You must reference the file within the Inspector. This is as easy as dragging an audio file from the Project View onto the Reference Property or using the Object Selector.

Now a sound effect file is referenced in the Audio Clip property

Components can include references to any other type of Component, GameObjects, or Assets. You can read more about assigning references on the Assigning References page.

The remaining properties on the Audio Clip are all Value properties. These can be adjusted directly in the Inspector. The Value properties on the Audio Clip are all toggles, numeric values, drop-down fields, but value properties can also be text strings, colors, curves, and other types. You can read more about these and about editing value properties on the Editing Value Properties page.

Copying and pasting Component settings

The context menu for a Component has items for copying and pasting its settings.

The copied values can be pasted to an existing component using the Paste Component Values menu item. Alternatively, you can use Paste Component As New to create a new Component with those values.

Testing out Properties

While your game is in Play Mode, you are free to change properties in any GameObject's Inspector. For example, you might want to experiment with different heights of jumping. If you create a Jump Height property in a script, you can enter Play Mode, change the value, and press the jump button to see what happens. Then without exiting Play Mode you can change it again and see the results within seconds. When you exit Play Mode, your properties will revert to their pre-Play Mode values, so you don't lose any work. This workflow gives you incredible power to experiment, adjust, and refine your gameplay without investing a lot of time in iteration cycles. Try it out with any property in Play Mode. We think you'll be impressed.

Changing the order of Components

The order in which components are listed in the Inspector doesn't matter in most cases. However, there are some Components, such as Image Effects where the ordering is significant. The context menu has Move Up and Move Down commands to let you reorder Components as necessary.

Removing Components

If you want to remove a Component, option- or right-click on its header in the Inspector, and choose Remove Component. Or you can left-click the options icon next to the ? on the Component header. All the property values will be lost and this cannot be undone, so be completely sure you want to remove the Component before you do.

Page last updated: 2012-09-12



The Component-Script Relationship

When you create a script and attach it to a GameObject, the script appears in the GameObject's Inspector just like a Component. This is because scripts become Components when they are saved; a script is simply a specific type of Component. In technical terms, a script compiles as a type of Component, and is treated like any other Component by the Unity engine. So essentially, a script is a Component that you create yourself. You define its members to be exposed in the Inspector, and it executes whatever functionality you've written.

Read more about creating and using scripts on the Scripting page.
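As a sketch of this idea in plain JavaScript (a stand-in, not the real compilation pipeline; `RotatorScript` and its `speed` member are hypothetical names), the members you define become the tweakable properties of your custom Component:

```javascript
// A "script component" is a component whose members you define yourself.
function RotatorScript() {
    this.speed = 5.0;           // exposed in the Inspector as an editable property
}
RotatorScript.prototype.Update = function (rotation, deltaTime) {
    // The behaviour you wrote, run by the engine like any other component.
    return rotation + this.speed * deltaTime;
};

var rotator = new RotatorScript();
rotator.speed = 90;             // the same tweak you would make in the Inspector
console.log(rotator.Update(0, 0.5)); // 45
```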

Page last updated: 2012-11-09



Deactivating GameObjects

A GameObject can be temporarily removed from the scene by marking it as inactive. This can be done by calling its SetActive method from a script or with the activation checkbox in the inspector.

A GameObject's activation checkbox

Effect of deactivating a parent GameObject

When a parent object is deactivated, the deactivation also overrides the activeSelf setting on all its child objects, so the whole hierarchy from the parent down is made inactive. Note that this does not change the value of the activeSelf property on the child objects, so they will return to their original state once the parent is reactivated. This means that you can't determine whether or not a child object is currently active in the scene by reading its activeSelf property. Instead, you should use the activeInHierarchy property, which takes the overriding effect of the parent into account.

This overriding behaviour was introduced in Unity 4.0. In earlier versions, there was a function called SetActiveRecursively which could be used to activate or deactivate the children of a given parent object. However, this function worked differently in that the activation setting of each child object was changed - the whole hierarchy could be switched off and on but the child objects had no way to "remember" the state they were originally in. To avoid breaking legacy code, SetActiveRecursively has been kept in the API for 4.0 but its use is not recommended and it may be removed in the future. In the unusual case where you actually want the children's activeSelf settings to be changed, you can use code like the following:-

// JavaScript
function DeactivateChildren(g: GameObject, a: boolean) {
	g.SetActive(a);

	for (var child: Transform in g.transform) {
		DeactivateChildren(child.gameObject, a);
	}
}


// C#
void DeactivateChildren(GameObject g, bool a) {
	g.SetActive(a);

	foreach (Transform child in g.transform) {
		DeactivateChildren(child.gameObject, a);
	}
}
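The difference between activeSelf and activeInHierarchy described above can be modelled in a few lines of plain JavaScript (stand-ins, not the UnityEngine API): a child keeps its own activeSelf flag, but it is only active in the hierarchy when every ancestor is active too.

```javascript
// Stand-in model of Unity 4.0 activation semantics (not the real API).
function Node(name, parent) {
    this.name = name;
    this.parent = parent || null;
    this.activeSelf = true;      // the object's own flag (the Inspector checkbox)
}
Node.prototype.activeInHierarchy = function () {
    // Active only if this object AND all of its ancestors are active.
    return this.activeSelf &&
        (this.parent === null || this.parent.activeInHierarchy());
};

var parent = new Node("Parent");
var child = new Node("Child", parent);

parent.activeSelf = false;                // deactivate the parent...
console.log(child.activeSelf);            // true  - the child's own flag is untouched
console.log(child.activeInHierarchy());   // false - but it is inactive in the scene

parent.activeSelf = true;                 // reactivate the parent...
console.log(child.activeInHierarchy());   // true  - the child "remembered" its state
```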

Page last updated: 2012-10-05



Using The Inspector

The Inspector is used to view and edit Properties of many different types.

Games in Unity are made up of multiple GameObjects that contain meshes, scripts, sounds, or other graphical elements like Lights. When you select a GameObject in the Hierarchy or Scene View, the Inspector will show and let you modify the Properties of that GameObject and all the Components and Materials on it. The same will happen if you select a Prefab in the Project View. This way you modify the functionality of GameObjects in your game. You can read more about the GameObject-Component relationship, as it is very important to understand.


Inspector shows the properties of a GameObject and the Components and Materials on it.

When you create a script yourself, which works as a custom Component type, the member variables of that script are also exposed as Properties that can be edited directly in the Inspector when that script component has been added to a GameObject. This way script variables can be changed without modifying the script itself.

Furthermore, the Inspector is used for showing import options of assets such as textures, 3D models, and fonts when selected. Some scene and project-wide settings are also viewed in the Inspector, such as all the Settings Managers.

Any property that is displayed in the Inspector can be directly modified. There are two main types of Properties: Values and References.

Page last updated: 2010-09-14



Editing Value Properties

Value properties do not reference anything and they can be edited right on the spot. Typical value properties are numbers, toggles, strings, and selection popups, but they can also be colors, vectors, curves, and other types.


Value properties on the inspector can be numbers, checkboxes, strings...

Many value properties have a text field and can be adjusted simply by clicking on them, entering a value using the keyboard, and pressing Enter to save the value.

Some Value Properties open up a small popup dialog that can be used to edit the value.

Color Picker

Properties of the Color type will open up the Color Picker. (On Mac OS X this color picker can be changed to the native OS color picker by enabling Use OS X Color Picker under Unity->Preferences.)

The Color Picker reference in the inspector is represented by:


Color Picker reference in the inspector.

And opens the Color Picker just by clicking on it:


Color Picker descriptions.

Use the Eyedropper Tool when you want to find a value just by putting your mouse over the color you want to grab.
RGB / HSV Selector lets you switch your values from Red, Green, Blue to Hue, Saturation and Value (Strength) of your color.
Finally, the transparency of the Color selected can be controlled by the Alpha Channel value.

Curve Editor

Properties of the AnimationCurve type will open up the Curve Editor. The Curve Editor lets you edit a curve or choose from one of the presets. For more information on editing curves, see the guide on Editing Curves.

The type is called AnimationCurve for legacy reasons, but it can be used to define any custom curve function. The function can then be evaluated at runtime from a script.
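Evaluating such a curve at runtime is a single call in a script. The sketch below mimics that behaviour with plain linear interpolation between keys (illustrative only; Unity's real AnimationCurve also supports tangents and wrap modes):

```javascript
// Toy stand-in for AnimationCurve.Evaluate, using linear keys only.
// keys: [{time, value}, ...] sorted by time.
function evaluate(keys, t) {
    if (t <= keys[0].time) return keys[0].value;
    var last = keys[keys.length - 1];
    if (t >= last.time) return last.value;        // clamp outside the range
    for (var i = 1; i < keys.length; i++) {
        if (t < keys[i].time) {
            var a = keys[i - 1], b = keys[i];
            var f = (t - a.time) / (b.time - a.time);
            return a.value + (b.value - a.value) * f;
        }
    }
}

// A curve rising from 0 to 10 over one second:
var curve = [{ time: 0, value: 0 }, { time: 1, value: 10 }];
console.log(evaluate(curve, 0.5)); // 5
console.log(evaluate(curve, 2));   // 10
```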

An AnimationCurve property is shown in the inspector as a small preview:


A preview of an AnimationCurve in the Inspector.

Clicking on it opens the Curve Editor:


The Curve Editor is for editing AnimationCurves.

Wrapping Mode Lets you select between Ping Pong, Clamp and Loop for the Control Keys in your curve.
The Presets lets you modify your curve to default outlines the curves can have.

Gradient editor

In graphics and animation, it is often useful to be able to blend one colour gradually into another, over space or time. A gradient is a visual representation of a colour progression, which simply shows the main colours (which are called stops) and all the intermediate shades between them. In Unity, gradients have their own special value editor, shown below.

The upward-pointing arrows along the bottom of the gradient bar denote the stops. You can select a stop by clicking on it; its value will be shown in the Color box which will open the standard colour picker when clicked. A new stop can be created by clicking just underneath the gradient bar. The position of any of the stops can be changed simply by clicking and dragging and a stop can be removed with ctrl/cmd + delete.

The downward-pointing arrows above the gradient bar are also stops but they correspond to the alpha (transparency) of the gradient at that point. By default, there are two stops set to 100% alpha (ie, fully opaque) but any number of stops can be added and edited in much the same way as the colour stops.

Page last updated: 2012-08-13



Editing Reference Properties

Reference properties are properties that reference other objects such as GameObjects, Components, or Assets. The reference slot shows what kind of objects can be used for the reference.

The Audio Clip property slot shows that it accepts a reference to an Audio Clip object.

An Audio Clip file is now referenced in the Audio Clip property.

This type of referencing is very quick and powerful, especially when using scripting. To learn more about using scripts and properties, see the Scripting Tutorial on the Tutorials page.

Object references can be assigned to a reference property either by drag and drop or by using the Object Picker.

Drag and Drop

You can use drag and drop simply by selecting the desired object in the Scene View, Hierarchy, or Project View and dragging it into the slot of the reference property.

If a reference property accepts a specific Component type (a Transform, for example), dragging a GameObject or a Prefab onto the reference property will work, provided that the GameObject or Prefab contains a Component of the correct type. Even though you dragged the GameObject or Prefab, the property will reference the Component in question.

If the object you drag onto a reference property is of the wrong type, or does not contain the right Component, you won't be able to assign it to the reference property.

Object Picker

You can click on the small target icon next to a reference slot to open the Object Picker.


Opening the Object Picker from the editor.

The Object Picker is a small window for assigning objects in the Inspector; before assigning, it lets you preview and search the objects that are available.

Although the Object Picker is really easy to use, there are a few things you should be aware of. These are described below.


Anatomy of the Object Picker.
  1. Search: When there are lots of objects in the picker, you can use the Search field to filter them. This search field can also search objects by their Labels.
  2. View Selector: Switches the base of the search between objects in the Scene and Assets.
  3. Preview Size: This horizontal scroll bar lets you increase or decrease the size of the object previews in the Preview Window, so you can see more or fewer objects at a time.
  4. Preview Window: Shows all the objects in the Scene/Assets folder that pass the Search field filter.
  5. Object Info: Displays information about the currently selected object. The contents of this field depend on the type of object being viewed; for example, if you pick a mesh it will tell you the number of vertices and triangles and whether it has UVs and is skinned, whereas if you pick an audio file it will give you information such as the bit rate and length of the audio.
  6. Object Preview: This also depends on the type of object being viewed. If you pick a mesh it will show you how the mesh looks, but if you pick a script file it will just show you the file's icon.

The Object Picker works on any asset in your project, be it a video, a song, a terrain, a GUI skin, a scripting file, or a mesh; it is a tool you will use often.

Hints

Page last updated: 2012-11-13



Multi-Object Editing

Starting in Unity 3.5 you can select multiple objects of the same type and edit them simultaneously in the Inspector. Any changed properties will be applied to all of the selected objects. This is a big time saver if you want to make the same change to many objects.

When selecting multiple objects, a component is only shown in the Inspector if that component exists on all the selected objects. If it only exists on some of them, a small note will appear at the bottom of the Inspector saying that components that are only on some of the selected objects cannot be multi-edited.

Property Values

When multiple objects are selected, each property shown in the Inspector represents that property on each of the selected objects. If the value of the property is the same for all the objects, the value will be shown as normal, just like when editing a single object. If the value of the property is not the same for all the selected objects, no value is shown and a dash or similar is shown instead, indicating that the values are different.

Multi-edit of two objects

Regardless of whether a value is shown or a dash, the property value can be edited as usual and the changed value is applied to all the selected objects. If the values are different and a dash is thus shown, it's also possible to right-click on the label of the property. This brings up a menu that lets you choose from which of the objects to inherit the value.

Selecting which object to get the value from

Multi-Editing Prefab or Model Instances

Prefabs can be multi-edited just like Game Objects in the scene. Instances of prefabs or of models can also be multi-edited; however certain restrictions apply: When editing a single prefab or model instance, any property that is different from the prefab or model will appear in bold, and when right clicking there's an option to revert the property to the value it has in the prefab or model. Furthermore, the Game Object has options to apply or revert all changes. None of these things are available when multi-object editing. Properties cannot be reverted or applied; nor will they appear in bold if different from the prefab or model. To remind you of this, the Inspector will show a note with Instance Management Disabled where the Select, Revert, and Apply buttons would normally appear.

Instance Management Disabled for multi-edit of prefabs

Non-Supported Objects

A few object types do not support multi-object editing. When you select multiple objects simultaneously, these objects will show a small note saying "Multi-object editing not supported".

If you have made a custom editor for one of your own scripts, it will also show this message if it doesn't support multi-object editing. See the script reference for the Editor class to learn how to implement support for multi-object editing for your own custom editors.

Page last updated: 2012-01-23



Inspector Options

The Inspector Lock and the Inspector Debug Mode are two useful options that can help in your workflow.

Lock

Locking the Inspector allows you to keep the Inspector focused on a specific GameObject while selecting other GameObjects. To toggle the lock of an Inspector, click the lock/unlock () icon above the Inspector, or open the tab menu and select Lock.


Locking the Inspector from the tab menu.

Note that you can have more than one Inspector open; you could, for example, lock one Inspector to a specific GameObject while keeping another unlocked to show whichever GameObject is selected.

Debug

Debug Mode allows you to inspect private variables of components in the Inspector, which are normally not shown. To change to Debug Mode, open the tab menu and select Debug.

In Debug Mode, all components are shown using a default interface, rather than the custom interfaces that some components use in Normal Mode. For example, the Transform component will in Debug Mode show the raw Quaternion values of the rotation, rather than the Euler angles shown in Normal Mode. You can also use Debug Mode to inspect the values of private variables in your own script components.


Debug Mode in the Inspector lets you inspect private variables in your scripts and other components.

The Debug Mode setting is per Inspector; you can have one Inspector in Debug Mode while another is not.

Page last updated: 2012-11-09



Using The Scene View

The Scene View is your interactive sandbox. You will use the Scene View to select and position environments, the player, the camera, enemies, and all other GameObjects. Maneuvering and manipulating objects within the Scene View are some of the most important functions in Unity, so it's important to be able to do them quickly.

Page last updated: 2010-09-06



Scene View Navigation

The Scene View has a set of navigation controls to help you move around quickly and efficiently.

Arrow Movement

You can use the Arrow Keys to move around the scene as though "walking" through it. The up and down arrows move the camera forward and backward in the direction it is facing. The left and right arrows pan the view sideways. Hold down the Shift key with an arrow to move faster.

Focusing

If you select a GameObject in the hierarchy, then move the mouse over the scene view and press the F key, the view will move so as to center on the object. This feature is referred to as frame selection.

Move, Orbit and Zoom

Moving, orbiting and zooming are key operations in Scene View navigation, so Unity provides several alternative ways to perform them for maximum convenience.

Using the Hand Tool

When the hand tool is selected (shortcut: Q), the following mouse controls are available:

Move: Click-drag to drag the camera around.
Orbit: Hold Alt and click-drag to orbit the camera around the current pivot point.
Zoom: Hold Control (Command on Mac) and click-drag to zoom the camera.

Holding down Shift will increase the rate of movement and zooming.

Shortcuts Without Using the Hand Tool

For extra efficiency, all of these controls can also be used regardless of which transform tool is selected. The most convenient controls depend on which mouse or track-pad you are using:

Action    3-button mouse    2-button mouse or track-pad    Mac with only one mouse button or track-pad
Move    Hold Alt and middle click-drag.    Hold Alt-Control and click-drag.    Hold Alt-Command and click-drag.
Orbit    Hold Alt and click-drag.    Hold Alt and click-drag.    Hold Alt and click-drag.
Zoom    Hold Alt and right click-drag or use scroll-wheel.    Hold Alt and right click-drag.    Hold Alt-Control and click-drag or use two-finger swipe.

Flythrough Mode

The Flythrough mode lets you navigate the Scene View by flying around in first person similar to how you would navigate in many games.

Flythrough mode is designed for Perspective Mode. In Isometric Mode, holding down the right mouse button and moving the mouse will orbit the camera instead.

Scene Gizmo

In the upper-right corner of the Scene View is the Scene Gizmo. This displays the Scene View Camera's current orientation, and allows you to quickly modify the viewing angle.

You can click on any of the arms to snap the Scene View Camera to that direction. Click the middle of the Scene Gizmo, or the text below it, to toggle between Isometric Mode and Perspective Mode. You can also always shift-click the middle of the Scene Gizmo to get a "nice" perspective view with an angle that is looking at the scene from the side and slightly from above.


Perspective mode.

Isometric mode. Objects do not get smaller with distance here!

Mac Trackpad Gestures

On a Mac with a trackpad, you can drag with two fingers to zoom the view.

You can also use three fingers to simulate the effect of clicking the arms of the Scene Gizmo: drag up, left, right or down to snap the Scene View Camera to the corresponding direction. In OS X 10.7 "Lion" you may have to change your trackpad settings in order to enable this feature:

Page last updated: 2012-11-26



Positioning GameObjects

When you are building your games, you'll place lots of different objects in your game world.

Focusing

It can be useful to focus the Scene View Camera on an object before manipulating it. Select any GameObject and press the F key. This will center the Scene View and pivot point on the selection. This is also known as Frame Selection.

Translate, Rotate, and Scale

Use the Transform Tools in the Toolbar to Translate, Rotate, and Scale individual GameObjects. Each has a corresponding Gizmo that appears around the selected GameObject in the Scene View. You can use the mouse and manipulate any Gizmo axis to alter the Transform Component of the GameObject, or you can type values directly into the number fields of the Transform Component in the Inspector. Each of the three transform modes can be selected with a hotkey: W for Translate, E for Rotate and R for Scale.

Gizmo Display Toggles

The Gizmo Display Toggles are used to define the location of any Transform Gizmo.


Gizmo Display Toggles

Unit Snapping

While dragging any Gizmo axis using the Translate Tool, you can hold the Control key (Command on Mac) to snap to increments defined in the Snap Settings.

You can change the unit distance that is used for unit snapping using the menu Edit->Snap Settings...


Scene View Unit Snapping settings.
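Unit snapping itself amounts to rounding each coordinate to the nearest multiple of the snap increment. A short sketch in plain JavaScript (illustrative only, not Unity's implementation):

```javascript
// Snap a coordinate to the nearest multiple of the Snap Settings increment.
function snap(value, increment) {
    return Math.round(value / increment) * increment;
}

// With a 1-unit Move increment:
console.log(snap(3.4, 1));    // 3
console.log(snap(3.6, 1));    // 4
// With a finer 0.25-unit grid:
console.log(snap(3.4, 0.25)); // 3.5
```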

Surface Snapping

While dragging in the center using the Translate Tool, you can hold Shift and Control (Command on Mac) to snap the object to the intersection of any Collider. This makes precise positioning of objects incredibly fast.

Look-At Rotation

While using the Rotate Tool, you can hold Shift and Control (Command on Mac) to rotate the object towards a point on the surface of any Collider. This makes the orientation of objects relative to one another simple.

Vertex Snapping

You can assemble your worlds more easily with a feature called Vertex Snapping. This feature is a simple but powerful tool in Unity. It lets you take any vertex from a given mesh and, with your mouse, place that vertex in the same position as any vertex from any other mesh you choose.

With this you can assemble your worlds really fast. For example, you could align road sections precisely in a racing game, or place power-up items on the vertices of a mesh.


Assembling a road with Vertex Snapping.

Using Vertex Snapping in Unity is simple: with the Translate Tool active, hold the V key, move the mouse over the vertex of the selected mesh that you want to use as the pivot, then click and drag to snap that vertex onto a vertex of another mesh. Release the mouse button and the V key when you are happy with the result.

A video on how to use vertex snapping can be found here.

Page last updated: 2012-11-13



View Modes

The Scene View control bar lets you choose various options for viewing the scene and also control whether lighting and audio are enabled. These controls only affect the scene view during development and have no effect on the built game.


Draw Mode

The first drop-down menu selects which Draw Mode will be used to depict the scene.


Draw Mode drop-down

Render Mode

The next drop-down along selects which of four Render Modes will be used to render the scene.


Render Mode drop-down

Scene Lighting, Game Overlay, and Audition Mode

To the right of the dropdown menus are three buttons which control other aspects of the scene representation.

The first button determines whether the view will be lit using a default scheme or with the lights that have actually been added to the scene. The default scheme is used initially but this will change automatically when the first light is added. The second button controls whether skyboxes and GUI elements will be rendered in the scene view and also shows and hides the placement grid. The third button switches audio sources in the scene on and off.

Page last updated: 2011-11-10



Gizmo and Icon Visibility

Gizmos and icons have a few display options which can be used to reduce clutter and improve the visual clarity of the scene during development.

The Icon Selector

Using the Icon Selector, you can easily set custom icons for GameObjects and scripts that will be used both in the Scene View and the Inspector. To change the icon for a GameObject, simply click on its icon in the Inspector. The icons of script assets can be changed in a similar way. In the Icon Selector is a special kind of icon called a Label Icon. This type of icon will show up in the Scene View as a text label using the name of the GameObject. Icons for built-in Components cannot be changed.

Note: When an asset's icon is changed, the asset will be marked as modified and therefore picked up by Revision Control Systems.

Selecting an icon for a GameObject


Selecting an icon for a script

Showing and Hiding Icons and Gizmos

The visibility of an individual component's gizmos depends on whether the component is expanded or collapsed in the inspector (ie, collapsed components are invisible). However, you can use the Gizmos dropdown to expand or collapse every component of a given type at once. This is a useful way to reduce visual clutter when there are a large number of gizmos and icons in the scene.

To show the state of the current gizmo and icon, click on Gizmos in the control bar of the Scene or Game View. The toggles here are used to set which icons and gizmos are visible.

Note that the scripts that show up in the Scripts section are those that either have a custom icon or have an OnDrawGizmos () or OnDrawGizmosSelected () function implemented.
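As a concrete illustration, a minimal script that would appear in the Scripts section of the Gizmos dropdown might look like this (the class name and the particular shapes drawn are hypothetical, not part of the manual):

```csharp
using UnityEngine;

public class SpawnPointMarker : MonoBehaviour
{
    // Drawn for every instance of this component, selected or not.
    void OnDrawGizmos()
    {
        Gizmos.color = Color.yellow;
        Gizmos.DrawWireSphere(transform.position, 0.5f);
    }

    // Drawn only while the GameObject is selected in the editor.
    void OnDrawGizmosSelected()
    {
        Gizmos.color = Color.red;
        Gizmos.DrawLine(transform.position, transform.position + transform.forward * 2f);
    }
}
```

Because this script implements OnDrawGizmos and OnDrawGizmosSelected, it will be listed in the Scripts section, where its gizmos can be toggled on and off.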


The Gizmos dropdown, displaying the visibility state of icons and gizmos

The Icon Scaling slider can be used to adjust the size used for icon display in the scene. If the slider is placed at the extreme right, the icon will always be drawn at its natural size. Otherwise, the icon will be scaled according to its distance from the scene view camera (although there is an upper limit on the display size to avoid screen clutter).

Page last updated: 2012-11-15



Searching

When working with large complex scenes it can be useful to search for specific objects. By using the Search feature in Unity, you can filter out only the object or group of objects that you want to see. You can search assets by their name, by Component type, and in some cases by asset Labels. You can specify the search mode by choosing from the Search drop-down menu.

Scene Search

When a scene is loaded in the Editor, you can see the objects in both the Scene View and the Hierarchy. The specific assets are shared in both places, so if you type in a search term (eg, "elevator"), you'll see the filter applied both visually in the Scene View and in a more typical manner in the Hierarchy. There is also no difference between typing the search term into the search field in the Scene View or the Hierarchy -- the filter takes effect in both views in either case.


Scene View and Hierarchy with no search applied.

Scene View and Hierarchy with active filtering of search term.

When a search term filter is active, the Hierarchy doesn't show hierarchical relationships between GameObjects, but you can select any GameObject, and its hierarchical path in the scene will be shown at the bottom of the Hierarchy.

Click on a GameObject in the filtered list to see its hierarchical path.

When you want to clear the search filter, just click the small cross in the search field.

In the Scene search you can search either by Name or by Type. Click on the small magnifying glass in the search field to open the search drop-down menu and choose the search mode.

Search by Name, Type, or All.

Project Search

The same fundamentals apply to searching of assets in the Project View -- just type in your search term and you'll see all the relevant assets appear in the filter.

In the Project search you can search by Name or by Type as in the Scene search, and additionally you can search by Label. Click on the small magnifying glass in the search field to open the search drop-down menu and choose the search mode.

Search by Name, Type, Label, or All.

Object Picker Search

When assigning an object via the Object Picker, you can also enter a search term to filter the objects you want to see.

Page last updated: 2011-11-10



Prefabs

A Prefab is a type of asset -- a reusable GameObject stored in Project View. Prefabs can be inserted into any number of scenes, multiple times per scene. When you add a Prefab to a scene, you create an instance of it. All Prefab instances are linked to the original Prefab and are essentially clones of it. No matter how many instances exist in your project, when you make any changes to the Prefab you will see the change applied to all instances.

Creating Prefabs

In order to create a Prefab, simply drag a GameObject that you've created in the scene into the Project View. The GameObject's name will turn blue to show that it is a Prefab. You can rename your new Prefab.

After you have performed these steps, the GameObject and all its children have been copied into the Prefab data. The Prefab can now be re-used in multiple instances. The original GameObject in the Hierarchy has now become an instance of the Prefab.

Prefab Instances

To create a Prefab instance in the current scene, drag the Prefab from the Project View into the Scene or Hierarchy View. This instance is linked to the Prefab, as shown by the blue text used for its name in the Hierarchy View.

Three of these GameObjects are linked to Prefabs. One of them is not.
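Prefab instances can also be created from a script at runtime using Instantiate. A minimal sketch (the prefab field is assumed to be assigned from the Project View in the Inspector; the class name is hypothetical):

```csharp
using UnityEngine;

public class PrefabSpawner : MonoBehaviour
{
    // Assign a Prefab from the Project View in the Inspector.
    public GameObject prefab;

    void Start()
    {
        // Creates a linked instance of the Prefab at the origin.
        GameObject instance =
            (GameObject)Instantiate(prefab, Vector3.zero, Quaternion.identity);
        instance.name = prefab.name + " (spawned)";
    }
}
```

Instances created this way behave like instances dragged into the scene: changes applied to the source Prefab propagate to them.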

Inheritance

Inheritance means that whenever the source Prefab changes, those changes are applied to all linked GameObjects. For example, if you add a new script to a Prefab, all of the linked GameObjects will instantly contain the script as well. However, it is possible to change the properties of a single instance while keeping the link intact. Simply change any property of a prefab instance, and watch as the variable name becomes bold. The variable is now overridden. Overridden properties will not be affected by changes in the source Prefab.

This allows you to modify Prefab instances to make them unique from their source Prefabs without breaking the Prefab link.


A linked GameObject with no overrides enabled.

A linked GameObject with several (bold) overrides enabled.

Imported Prefabs

When you place a mesh asset into your Assets folder, Unity automatically imports the file and generates something that looks similar to a Prefab out of the mesh. This is not actually a Prefab, it is simply the asset file itself. Instancing and working with assets introduces some limitations that are not present when working with normal Prefabs.


Notice the asset icon is a bit different from the Prefab icons

The asset is instantiated in the scene as a GameObject, linked to the source asset instead of a normal Prefab. Components can be added and removed from this GameObject as normal. However, you cannot apply any changes to the asset itself since this would add data to the asset file itself! If you're creating something you want to re-use, you should make the asset instance into a Prefab following the steps listed above under "Creating Prefabs".

Page last updated: 2012-09-15



Lights

Lights are an essential part of every scene. While meshes and textures define the shape and look of a scene, lights define the color and mood of your 3D environment. You'll likely work with more than one light in each scene. Making them work together requires a little practice, but the results can be quite amazing.

A simple, two-light setup

Lights can be added to your scene from the GameObject->Create Other menu. Once a light has been added, you can manipulate it like any other GameObject. Additionally, you can add a Light Component to any selected GameObject by using Component->Rendering->Light.

There are many different options within the Light Component in the Inspector.

Light Component properties in the Inspector

By simply changing the Color of a light, you can give a whole different mood to the scene.

Bright, sunny lights

Dark, medieval lights

Spooky night lights

The lights you create this way are realtime lights -- their lighting is calculated each frame while the game is running. If you know the light will not change, you can make your game faster and look much better by using Lightmapping.

Rendering Paths

Unity supports different Rendering Paths. These paths mainly affect lights and shadows, so choosing the correct rendering path for your game's requirements can improve your project's performance. For more information about rendering paths, see the Rendering paths section.

More information

For more information about using Lights, check the Lights page in the Reference Manual.

Page last updated: 2012-11-13



Cameras

Just as cameras are used in films to display the story to the audience, Cameras in Unity are used to display the game world to the player. You will always have at least one camera in a scene, but you can have more than one. Multiple cameras can give you a two-player splitscreen or create advanced custom effects. You can animate cameras, or control them with physics. Practically anything you can imagine is possible with cameras, and you can use typical or unique cameras to fit your game's style.

The remaining text is from the Camera Component reference page.

Camera

Cameras are the devices that capture and display the world to the player. By customizing and manipulating cameras, you can make the presentation of your game truly unique. You can have an unlimited number of cameras in a scene. They can be set to render in any order, at any place on the screen, or only certain parts of the screen.


Unity's flexible Camera object

Properties

Clear Flags - Determines which parts of the screen will be cleared. This is handy when using multiple Cameras to draw different game elements.
Background - The color applied to the remaining screen after all elements in view have been drawn and there is no skybox.
Culling Mask - Includes or omits layers of objects to be rendered by the Camera. Assign layers to your objects in the Inspector.
Projection - Toggles the camera's capability to simulate perspective.
    Perspective - Camera will render objects with perspective intact.
    Orthographic - Camera will render objects uniformly, with no sense of perspective.
Size (when Orthographic is selected) - The viewport size of the Camera when set to Orthographic.
Field of view - The width of the Camera's view angle, measured in degrees along the local Y axis.
Clipping Planes - Distances from the camera to start and stop rendering.
    Near - The closest point relative to the camera that drawing will occur.
    Far - The furthest point relative to the camera that drawing will occur.
Normalized View Port Rect - Four values that indicate where on the screen this camera view will be drawn, in Screen Coordinates (values 0-1).
    X - The beginning horizontal position that the camera view will be drawn.
    Y - The beginning vertical position that the camera view will be drawn.
    W (Width) - Width of the camera output on the screen.
    H (Height) - Height of the camera output on the screen.
Depth - The camera's position in the draw order. Cameras with a larger value will be drawn on top of cameras with a smaller value.
Rendering Path - Options for defining what rendering methods will be used by the camera.
    Use Player Settings - This camera will use whichever Rendering Path is set in the Player Settings.
    Vertex Lit - All objects rendered by this camera will be rendered as Vertex-Lit objects.
    Forward - All objects will be rendered with one pass per material, as was standard in Unity 2.x.
    Deferred Lighting (Unity Pro only) - All objects will be drawn once without lighting, then the lighting of all objects will be rendered together at the end of the render queue.
Target Texture (Unity Pro/Advanced only) - Reference to a Render Texture that will contain the output of the Camera view. Setting this reference will disable this Camera's capability to render to the screen.
HDR - Enables High Dynamic Range rendering for this camera.

Details

Cameras are essential for displaying your game to the player. They can be customized, scripted, or parented to achieve just about any kind of effect imaginable. For a puzzle game, you might keep the Camera static for a full view of the puzzle. For a first-person shooter, you would parent the Camera to the player character, placed at the character's eye level. For a racing game, you would probably have the Camera follow your player's vehicle.

You can create multiple Cameras and assign each one a different Depth. Cameras are drawn from low Depth to high Depth. In other words, a Camera with a Depth of 2 will be drawn on top of a Camera with a Depth of 1. You can adjust the values of the Normalized View Port Rectangle property to resize and position the Camera's view onscreen. This can create multiple mini-views like missile cams, map views, rear-view mirrors, etc.

Rendering Path

Unity supports different rendering paths. You should choose which one to use depending on your game content and target platform / hardware. Different rendering paths have different feature and performance characteristics that mostly affect lights and shadows. The rendering path used by your project is chosen in the Player Settings; additionally, you can override it for each Camera.

For more information on rendering paths, check the rendering paths page.

Clear Flags

Each Camera stores color and depth information when it renders its view. The portions of the screen that are not drawn in are empty, and will display the skybox by default. When you are using multiple Cameras, each one stores its own color and depth information in buffers, accumulating more data as each Camera renders. As any particular Camera in your scene renders its view, you can set the Clear Flags to clear different collections of the buffer information. This is done by choosing one of the following four options:

Skybox

This is the default setting. Any empty portions of the screen will display the current Camera's skybox. If the current Camera has no skybox set, it will default to the skybox chosen in the Render Settings (found in Edit->Render Settings). It will then fall back to the Background Color. Alternatively, a Skybox component can be added to the camera. If you want to create a new Skybox, you can use this guide.

Solid Color

Any empty portions of the screen will display the current Camera's Background Color.

Depth Only

If you wanted to draw a player's gun without letting it get clipped inside the environment, you would set one Camera at Depth 0 to draw the environment, and another Camera at Depth 1 to draw the weapon alone. The weapon Camera's Clear Flags should be set to Depth only. This will keep the graphical display of the environment on the screen, but discard all information about where each object exists in 3D space. When the gun is drawn, the opaque parts will completely cover anything drawn, regardless of how close the gun is to the wall.


The gun is drawn last, after clearing the depth buffer of the cameras before it

Don't Clear

This mode does not clear either the color or the depth buffer. The result is that each frame is drawn over the next, resulting in a smear-looking effect. This isn't typically used in games, and would more likely be used with a custom shader.

Clip Planes

The Near and Far Clip Plane properties determine where the Camera's view begins and ends. The planes are laid out perpendicular to the Camera's direction and are measured from its position. The Near plane is the closest location that will be rendered, and the Far plane is the furthest.

The clipping planes also determine how depth buffer precision is distributed over the scene. In general, to get better precision you should move the Near plane as far away as possible.

The near and far clip planes, together with the planes defined by the field of view of the camera, describe what is popularly known as the camera frustum. When rendering your objects, Unity does not display objects that are completely outside of this frustum. This is called Frustum Culling. Frustum Culling happens irrespective of whether you use Occlusion Culling in your game.

For performance reasons, you might want to cull small objects earlier. For example, small rocks and debris could be made invisible at a much smaller distance than large buildings. To do that, put small objects into a separate layer and set up per-layer cull distances using the Camera.layerCullDistances script function.
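A minimal sketch of per-layer cull distances, assuming small props have been placed on layer 8 (the layer number and distance here are hypothetical examples; attach the script to the GameObject carrying the Camera):

```csharp
using UnityEngine;

public class LayerCulling : MonoBehaviour
{
    void Start()
    {
        Camera cam = GetComponent<Camera>();

        // One entry per layer; a value of 0 means "use the far clip plane".
        float[] distances = new float[32];

        // Objects on layer 8 are culled beyond 15 units from the camera.
        distances[8] = 15.0f;

        cam.layerCullDistances = distances;
    }
}
```

Layers not given an explicit distance keep the camera's normal far plane, so only the small-object layer is culled early.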

Culling Mask

The Culling Mask is used for selectively rendering groups of objects using Layers. More information on using layers can be found here.

Commonly, it is good practice to put your User Interface on a different layer, then render it by itself with a separate Camera set to render the UI layer alone.

In order for the UI to display on top of the other Camera views, you'll also need to set the Clear Flags to Depth only and make sure that the UI Camera's Depth is higher than the other Cameras'.

Normalized Viewport Rectangle

Normalized Viewport Rectangles are specifically for defining a certain portion of the screen that the current camera view will be drawn upon. You could put a map view in the lower-left corner of the screen, or a missile-tip view in the upper-right corner. With a bit of design work, you can use the Viewport Rectangle to create some unique behaviors.

With the Normalized Viewport Rectangle, it is quite easy to create a two-player split screen effect. After you have created your two cameras, change both cameras' H values to be 0.5, then set player one's Y value to 0.5 and player two's Y value to 0. This will make player one's camera display from halfway up the screen to the top, and player two's camera start at the bottom and stop halfway up the screen.
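The same split-screen setup can be sketched from a script instead of the Inspector (the two Camera fields are assumed to be assigned in the Inspector; the class name is hypothetical):

```csharp
using UnityEngine;

public class SplitScreenSetup : MonoBehaviour
{
    public Camera player1Cam;
    public Camera player2Cam;

    void Start()
    {
        // Rect takes normalized (0-1) x, y, width, height viewport values.
        player1Cam.rect = new Rect(0f, 0.5f, 1f, 0.5f); // top half of the screen
        player2Cam.rect = new Rect(0f, 0f,   1f, 0.5f); // bottom half of the screen
    }
}
```

Setting Camera.rect at runtime is equivalent to editing the Normalized View Port Rect values in the Inspector.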


Two-player display created with the Normalized Viewport Rectangle

Orthographic

Marking a Camera as Orthographic removes all perspective from the Camera's view. This is mostly useful for making isometric or 2D games.

Note that fog is rendered uniformly in orthographic camera mode and may therefore not appear as expected. For the reason why, see the component reference on Render Settings.


Perspective camera.

Orthographic camera. Objects do not get smaller with distance here.

Render Texture

This feature is only available for Unity Advanced licenses. It will place the camera's view onto a Texture that can then be applied to another object. This makes it easy to create sports arena video monitors, surveillance cameras, reflections, etc.


A Render Texture used to create a live arena-cam

Hints

Page last updated: 2007-11-16



Terrains

This section explains how to use the Terrain Engine. It covers creation, technical details, and other considerations. It is broken into the following sections:

Using Terrains

This section covers the basic information about terrains, including how to create them and how to use the terrain tools and brushes.

Height

This section explains how to use the different tools and brushes to modify the Height of your Terrain.

Terrain Textures

This section explains how to add, paint, and blend Terrain Textures using different brushes.

Trees

This section contains important information for creating your own tree assets. It also covers adding and painting trees on your Terrain.

Grass

This section explains how Grass works, and how to use it.

Detail Meshes

This section explains practical usage for detail meshes like rocks, haystacks, and vegetation.

Lightmaps

You can lightmap terrains just like any other object using Unity's built-in lightmapper. See the Lightmapping Quickstart for help.

Other Settings

This section covers all the other settings associated with Terrains.

Mobile performance note

Rendering terrain is quite expensive, so terrain rendering is not practical on lower-end mobile devices.

Page last updated: 2010-06-03



Asset Import and Creation

A large part of making a game is utilizing your asset source files in your GameObjects. This goes for textures, models, sound effects and behaviour scripts. Using the Project View inside Unity, you have quick access to all the files that make up your game:


The Project View displays all source files and created Prefabs

This view shows the organization of files in your project's Assets folder. Whenever you update one of your asset files, the changes are immediately reflected in your game!

To import an asset file into your project, move the file into (your Project folder)->Assets in the Finder, and it will automatically be imported into Unity. To apply your assets, simply drag the asset file from the Project View window into the Hierarchy or Scene View. If the asset is meant to be applied to another object, drag the asset over the object.

Hints

Continue reading for more information:

Page last updated: 2012-01-08



Importing Assets

Unity will automatically detect files as they are added to your Project folder's Assets folder. When you put any asset into your Assets folder, you will see the asset appear in your Project View.


The Project View is your window into the Assets folder, normally accessible from the file manager

When you are organizing your Project View, there is one very important thing to remember:

Never move any assets or organize this folder from the Explorer (Windows) or Finder (OS X). Always use the Project View!

There is a lot of meta data stored about relationships between asset files within Unity. This data is all dependent on where Unity expects to find these assets. If you move an asset from within the Project View, these relationships are maintained. If you move them outside of Unity, these relationships are broken. You'll then have to manually re-link lots of dependencies, which is something you probably don't want to do.

So just remember to only save assets to the Assets folder from other applications, and never rename or move files outside of Unity. Always use Project View. You can safely open files for editing from anywhere, of course.

Creating and Updating Assets

When you are building a game and you want to add a new asset of any type, all you have to do is create the asset and save it somewhere in the Assets folder. When you return to Unity or launch it, the added file(s) will be detected and imported.

Additionally, as you update and save your assets, the changes will be detected and the asset will be re-imported in Unity. This allows you to focus on refining your assets without struggling to make them compatible with Unity. Updating and saving your assets normally from their native applications provides an optimal, hassle-free workflow that feels natural.

Asset Types

There are a handful of basic asset types that will go into your game. The types are:

We'll discuss the details of importing each of these file types and how they are used.

Meshes & Animations

Whichever 3D package you are using, Unity will import the meshes and animations from each file. For a list of applications that are supported by Unity, please see this page.

Your mesh file does not need to have an animation to be imported. If you do use animations, you have your choice of importing all animations from a single file, or importing separate files, each with one animation. For more information about importing animations, please see the page about Animation Import.

Once your mesh is imported into Unity, you can drag it to the Scene or Hierarchy to create an instance of it. You can also add Components to the instance, which will not be attached to mesh file itself.

Meshes will be imported with UVs and a number of default Materials (one material per UV). You can then assign the appropriate texture files to the materials and complete the look of your mesh in Unity's game engine.

Textures

Unity supports all image formats. Even when working with layered Photoshop files, they are imported without disturbing the Photoshop format. This allows you to work with a single texture file for a very care-free and streamlined experience.

You should make your textures in dimensions that are to the power of two (e.g. 32x32, 64x64, 128x128, 256x256, etc.) Simply placing them in your project's Assets folder is sufficient, and they will appear in the Project View.

Once your texture has been imported, you should assign it to a Material. The material can then be applied to a mesh, Particle System, or GUI Texture. Using the Import Settings, it can also be converted to a Cubemap or Normalmap for different types of applications in the game. For more information about importing textures, please read the Texture Component page.

Sounds

Desktop

Unity features support for two types of audio: Uncompressed Audio or Ogg Vorbis. Any type of audio file you import into your project will be converted to one of these formats.

File Type Conversion

.AIFFConverted to uncompressed audio on import, best for short sound effects.
.WAVConverted to uncompressed audio on import, best for short sound effects.
.MP3Converted to Ogg Vorbis on import, best for longer music tracks.
.OGGCompressed audio format, best for longer music tracks.

Import Settings

If you are importing a file that is not already compressed as Ogg Vorbis, you have a number of options in the Import Settings of the Audio Clip. Select the Audio Clip in the Project View and edit the options in the Audio Importer section of the Inspector. Here, you can compress the Clip into Ogg Vorbis format, force it into Mono or Stereo playback, and tweak other options. There are positives and negatives for both Ogg Vorbis and uncompressed audio. Each has its own ideal usage scenarios, and you generally should not use either one exclusively.

Read more about using Ogg Vorbis or Uncompressed audio on the Audio Clip Component Reference page.

iOS

Unity features support for two types of audio: Uncompressed Audio or MP3 Compressed audio. Any type of audio file you import into your project will be converted to one of these formats.

File Type Conversion

.AIFFImports as uncompressed audio for short sound effects. Can be compressed in Editor on demand.
.WAVImports as uncompressed audio for short sound effects. Can be compressed in Editor on demand.
.MP3Imports as Apple Native compressed format for longer music tracks. Can be played on device hardware.
.OGGOGG compressed audio format, incompatible with the iPhone device. Please use MP3 compressed sounds on the iPhone.

Import Settings

When you are importing an audio file, you can select its final format and choose to force it to stereo or mono channels. To access the Import Settings, select the Audio Clip in the Project View and find the Audio Importer in the Inspector. Here, you can compress the Clip into Ogg Vorbis format, force it into Mono or Stereo playback, and tweak other options, such as the very important Decompress On Load setting.

Read more about using MP3 Compressed or Uncompressed audio on the Audio Clip Component Reference page.

Android

Unity features support for two types of audio: Uncompressed Audio or MP3 Compressed audio. Any type of audio file you import into your project will be converted to one of these formats.

File Type Conversion

.AIFFImports as uncompressed audio for short sound effects. Can be compressed in Editor on demand.
.WAVImports as uncompressed audio for short sound effects. Can be compressed in Editor on demand.
.MP3Imports as MP3 compressed format for longer music tracks.
.OGGNote: the OGG compressed audio format is incompatible with some Android devices, so Unity does not support it for the Android platform. Please use MP3 compressed sounds instead.

Import Settings

When you are importing an audio file, you can select its final format and choose to force it to stereo or mono channels. To access the Import Settings, select the Audio Clip in the Project View and find the Audio Importer in the Inspector. Here, you can compress the Clip into Ogg Vorbis format, force it into Mono or Stereo playback, and tweak other options, such as the very important Decompress On Load setting.

Read more about using MP3 Compressed or Uncompressed audio on the Audio Clip Component Reference page.

Once sound files are imported, they can be attached to any GameObject. The Audio file will create an Audio Source Component automatically when you drag it onto a GameObject.

Page last updated: 2012-10-26



Meshes

When a 3D model is imported, Unity represents it internally as a Mesh. A Mesh must be attached to a GameObject using a Mesh Filter component. For the mesh to be visible, the GameObject must also have a Mesh Renderer or another suitable rendering component attached. With these components in place, the mesh will be displayed at the GameObject's position, with its appearance determined by the Material used by the renderer.


A Mesh Filter together with a Mesh Renderer makes the model appear on screen

Unity's mesh importer provides many options for controlling the generation of the mesh and associating it with its textures and materials. These options are covered on the following pages:

Page last updated: 2012-01-20



3D-formats

Meshes can be imported into Unity from two main types of files:

  1. Exported 3D file formats, such as .FBX or .OBJ
  2. Proprietary 3D application files, such as the .Max and .Blend formats used by 3D Studio Max and Blender

Either type should enable you to get your meshes into Unity, but there are considerations as to which you choose:

Exported 3D file formats

Unity can read .FBX, .dae (Collada), .3DS, .dxf, and .obj files. An FBX exporter can be found here, and obj or Collada exporters can be found for many applications.

Advantages

Disadvantages

Proprietary 3D application files

Unity can also import, through conversion, files from Max, Maya, Blender, Cinema4D, Modo, Lightwave, and Cheetah3D, e.g. .MAX, .MB, .MA, etc.

Advantages

Disadvantages

Page last updated: 2012-11-18



Animations (Legacy)

Unity's Animation System allows you to create beautifully animated skinned characters. The Animation System supports animation blending, mixing, additive animations, walk cycle time synchronization, animation layers, control over all aspects of the animation playback (time, speed, blend-weights), mesh skinning with 1, 2 or 4 bones per vertex and finally physically based ragdolls.

For best practices on creating a rigged character with optimal performance in Unity, we recommend that you check out the section on Modeling Optimized Characters.

The following topics are covered on this page:

Importing Animations

Splitting animations

Importing Inverse Kinematics

When importing animated characters from Maya that are created using IK, you have to check the Bake IK & simulation box in the Import Settings. Otherwise, your character will not animate correctly.

Bringing the character into the Scene

When you have imported your model, drag the object from the Project View into the Scene View or Hierarchy View.


The animated character is added by dragging it into the scene

The character above has three animations in the animation list and no default animation. You can add more animations to the character by dragging animation clips from the Project View on to the character (in either the Hierarchy or Scene View). This will also set the default animation. When you hit Play, the default animation will be played.

TIP: You can use this to quickly test if your animation plays back correctly. Also use the Wrap Mode to view different behaviors of the animation, especially looping.

Page last updated: 2012-10-04



Materials

There is a close relationship between Materials and Shaders in Unity. Shaders contain code that defines what kind of properties and assets to use. Materials allow you to adjust properties and assign assets.


A Shader is implemented through a Material

To create a new Material, use Assets->Create->Material from the main menu or the Project View context menu. Once the Material has been created, you can apply it to an object and tweak all of its properties in the Inspector. To apply it to an object, just drag it from the Project View to any object in the Scene or Hierarchy.

Setting Material Properties

You can select which Shader you want any particular Material to use. Simply expand the Shader drop-down in the Inspector, and choose your new Shader. The Shader you choose will dictate the available properties to change. The properties can be colors, sliders, textures, numbers, or vectors. If you have applied the Material to an active object in the Scene, you will see your property changes applied to the object in real-time.

There are two ways to apply a Texture to a property.

  1. Drag it from the Project View on top of the Texture square
  2. Click the Select button, and choose the texture from the drop-down list that appears

Two placement options are available for each Texture:

Tiling - Scales the texture along the different axes.
Offset - Slides the texture around.

Built-in Shaders

There is a library of built-in Shaders that come standard with every installation of Unity. There are over 30 of these built-in Shaders, grouped into six basic families.

In each group, built-in shaders range in complexity, from the simple VertexLit to the complex Parallax Bumped with Specular. For more information about the performance of Shaders, please read the built-in Shader performance page.

This grid displays a thumbnail of all built-in Shaders:


The builtin Unity shaders matrix

Shader technical details

Unity has an extensive Shader system, allowing you to tweak the look of all in-game graphics. It works like this:

A Shader basically defines a formula for how the in-game shading should look. Within any given Shader is a number of properties (typically textures). Shaders are implemented through Materials, which are attached directly to individual GameObjects. Within a Material, you will choose a Shader, then define the properties (usually textures and colors, but properties can vary) that are used by the Shader.

This is rather complex, so let's look at a workflow diagram:

On the left side of the graph is the Carbody Shader. 2 different Materials are created from this: Blue car Material and Red car Material. Each of these Materials have 2 textures assigned; the Car Texture defines the main texture of the car, and a Color FX texture. These properties are used by the shader to make the car finish look like 2-tone paint. This can be seen on the front of the red car: it is yellow where it faces the camera and then fades towards purple as the angle increases. The car materials are attached to the 2 cars. The car wheels, lights and windows don't have the color change effect, and must hence use a different Material. At the bottom of the graph there is a Simple Metal Shader. The Wheel Material is using this Shader. Note that even though the same Car Texture is reused here, the end result is quite different from the car body, as the Shader used in the Material is different.

To be more specific, a Shader defines:

A Material defines:

Shaders are meant to be written by graphics programmers. They are created using the ShaderLab language, which is quite simple. However, getting a shader to work well on a variety of graphics cards is an involved job and requires a fairly comprehensive knowledge of how graphics cards work.

A number of shaders are built into Unity directly, and some more come in the Standard Assets Library. If you like, there is plenty more shader information in the Built-in Shader Guide.

Page last updated: 2010-09-16



Textures

2D Textures

Textures bring your Meshes, Particles, and interfaces to life! They are image or movie files that you lay over or wrap around your objects. As they are so important, they have a lot of properties. If you are reading this for the first time, jump down to Details, and return to the actual settings when you need a reference.

The shaders you use for your objects put specific requirements on which textures you need, but the basic principle is that you can put any image file inside your project. If it meets the size requirements (specified below), it will get imported and optimized for game use. This extends to multi-layer Photoshop or TIFF files -- they are flattened on import, so there is no size penalty for your game.

Properties

The Texture Inspector looks a bit different from most others.

The top section contains a few settings, and the bottom part contains the Texture Importer and the texture preview.

Texture Importer

Textures all come from image files in your Project Folder. How they are imported is specified by the Texture Importer. You change these by selecting the file texture in the Project View and modifying the Import Settings in the Inspector.

The topmost item in the inspector is the Texture Type menu that allows you to select the type of texture you want to create from the source image file.

Texture Type - Select this to set basic parameters depending on the purpose of your texture.
    Texture - This is the most common setting, used for all textures in general.
    Normal Map - Select this to turn the color channels into a format suitable for real-time normal mapping. For more info, see Normal Maps below.
    GUI - Use this if your texture is going to be used on any HUD/GUI Controls.
    Reflection - Also known as a Cube Map; used to create reflections on textures. Check Cubemap Textures for more info.
    Cookie - This sets up your texture with the basic parameters used for the Cookies of your lights.
    Advanced - Select this when you want to set specific parameters on your texture and have total control over it.

Basic Texture Settings selected
Alpha From Grayscale - If enabled, an alpha transparency channel will be generated from the image's existing values of light and dark.
Wrap Mode - Selects how the Texture behaves when tiled:
    Repeat - The Texture repeats (tiles) itself
    Clamp - The Texture's edges get stretched
Filter Mode - Selects how the Texture is filtered when it gets stretched by 3D transformations:
    Point - The Texture becomes blocky up close
    Bilinear - The Texture becomes blurry up close
    Trilinear - Like Bilinear, but the Texture also blurs between the different mip levels
Aniso Level - Increases texture quality when viewing the texture at a steep angle. Good for floor and ground textures. See below.

Normal map settings in the Texture Importer
Create from Greyscale - If this is enabled, the Bumpiness and Filtering options will be shown.
Bumpiness - Controls the amount of bumpiness.
Filtering - Determines how the bumpiness is calculated:
    Smooth - This generates normal maps that are quite smooth.
    Sharp - Also known as a Sobel filter. This generates normal maps that are sharper than Standard.
Wrap Mode - Selects how the Texture behaves when tiled:
    Repeat - The Texture repeats (tiles) itself
    Clamp - The Texture's edges get stretched
Filter Mode - Selects how the Texture is filtered when it gets stretched by 3D transformations:
    Point - The Texture becomes blocky up close
    Bilinear - The Texture becomes blurry up close
    Trilinear - Like Bilinear, but the Texture also blurs between the different mip levels
Aniso Level - Increases texture quality when viewing the texture at a steep angle. Good for floor and ground textures. See below.

GUI settings in the Texture Importer
Filter Mode - Selects how the Texture is filtered when it gets stretched by 3D transformations:
    Point - The Texture becomes blocky up close
    Bilinear - The Texture becomes blurry up close
    Trilinear - Like Bilinear, but the Texture also blurs between the different mip levels

Cursor settings in the Texture Importer
Wrap Mode - Selects how the Texture behaves when tiled:
    Repeat - The Texture repeats (tiles) itself
    Clamp - The Texture's edges get stretched
Filter Mode - Selects how the Texture is filtered when it gets stretched by 3D transformations:
    Point - The Texture becomes blocky up close
    Bilinear - The Texture becomes blurry up close
    Trilinear - Like Bilinear, but the Texture also blurs between the different mip levels

Reflection settings in the Texture Importer
Mapping - This determines how the texture will be mapped to a cubemap.
    Sphere Mapped - Maps the texture to a "sphere like" cubemap.
    Cylindrical - Maps the texture to a cylinder. Use this when you want to use reflections on objects that are like cylinders.
    Simple Sphere - Maps the texture to a simple sphere, deforming the reflection when you rotate it.
    Nice Sphere - Maps the texture to a sphere, deforming it when you rotate it, but you can still see the texture's wrap.
    6 Frames Layout - The texture contains the six faces of the cube arranged in a cubemap layout: either a cross shape, or a sequence of images (+x -x +y -y +z -z) laid out vertically or horizontally.
Fixup edge seams - (Point lights only) Removes visual artifacts at the joined edges of the image when used with glossy reflections.
Filter Mode - Selects how the Texture is filtered when it gets stretched by 3D transformations:
    Point - The Texture becomes blocky up close
    Bilinear - The Texture becomes blurry up close
    Trilinear - Like Bilinear, but the Texture also blurs between the different mip levels
Aniso Level - Increases texture quality when viewing the texture at a steep angle. Good for floor and ground textures. See below.

An interesting way to add a lot of visual detail to your scenes is to use cookies: greyscale textures used to control the precise look of in-game lighting. This is fantastic for making moving clouds and giving an impression of dense foliage. The Light page has all the details on this, but the main requirement is that for a texture to be usable as a cookie, you need to set its Texture Type to Cookie.


Cookie settings in the Texture Importer
Light Type: The type of light that the texture will be applied to (this can be a Spotlight, Point, or Directional light). For Directional lights this texture will tile, so in the texture inspector you must set the Edge Mode to Repeat. For Spotlights you should keep the edges of your cookie texture solid black in order to get the proper effect, and set the Edge Mode to Clamp in the texture inspector.
Mapping: This determines how the texture will be mapped to a cubemap:
  Sphere Mapped: Maps the texture to a "sphere like" cubemap.
  Cylindrical: Maps the texture to a cylinder. Use this when you want to use reflections on objects that are like cylinders.
  Simple Sphere: Maps the texture to a simple sphere, deforming the reflection when you rotate it.
  Nice Sphere: Maps the texture to a sphere, deforming it when you rotate, but you can still see the texture's wrap.
  6 Frames Layout: The texture contains the six faces of the cube laid out in one of the standard cubemap layouts: a cross, or a sequence of images (+x -x +y -y +z -z), in either horizontal or vertical orientation.
Fixup edge seams: (Point lights only) Removes visual artifacts at the joined edges of the map image, which matters for glossy reflections.
Alpha From Greyscale: If enabled, an alpha transparency channel will be generated from the image's existing values of light and dark.

Lightmap settings in the Texture Importer
Filter Mode: Selects how the texture is filtered when it gets stretched by 3D transformations:
  Point: The texture becomes blocky up close
  Bilinear: The texture becomes blurry up close
  Trilinear: Like Bilinear, but the texture also blurs between the different mip levels
Aniso Level: Increases texture quality when viewing the texture at a steep angle. Good for floor and ground textures. See below.

The Advanced Texture Importer Settings dialog
Non Power of 2: If the texture has a non-power-of-two size, this defines the scaling behavior at import time (see the Texture Sizes section below for details):
  None: The texture will be padded to the next larger power-of-two size, for use with the GUITexture component.
  To nearest: The texture is scaled to the nearest power-of-two size at import time. For instance, a 257x511 texture becomes 256x512. Note that PVRTC formats require textures to be square (width equal to height), so the final size will be upscaled to 512x512.
  To larger: The texture is scaled to the next larger power-of-two size at import time. For instance, a 257x511 texture becomes 512x512.
  To smaller: The texture is scaled to the next smaller power-of-two size at import time. For instance, a 257x511 texture becomes 256x256.
Generate Cube Map: Generates a cubemap from the texture using one of several generation methods:
  Spheremap: Maps the texture to a "sphere like" cubemap.
  Cylindrical: Maps the texture to a cylinder. Use this when you want to use reflections on objects that are like cylinders.
  SimpleSpheremap: Maps the texture to a simple sphere, deforming the reflection when you rotate it.
  NiceSpheremap: Maps the texture to a sphere, deforming it when you rotate, but you can still see the texture's wrap.
  FacesVertical: The texture contains the six faces of the cube arranged in a vertical strip, in the order +x -x +y -y +z -z.
  FacesHorizontal: The texture contains the six faces of the cube arranged in a horizontal strip, in the order +x -x +y -y +z -z.
  CrossVertical: The texture contains the six faces of the cube arranged in a vertically oriented cross.
  CrossHorizontal: The texture contains the six faces of the cube arranged in a horizontally oriented cross.
Read/Write Enabled: Select this to enable access to the texture data from scripts (GetPixels, SetPixels and other Texture2D functions). Note however that a copy of the texture data will be made, doubling the amount of memory required for the texture asset, so use this only if absolutely necessary. This is only valid for uncompressed and DXT compressed textures; other types of compressed textures cannot be read from. Disabled by default.
Import Type: The way the image data is interpreted:
  Default: The standard texture.
  Normal Map: The texture is treated as a normal map (enables other options).
  Lightmap: The texture is treated as a lightmap (disables other options).
Alpha from grayscale: (Default mode only) Generates the alpha channel from the luminance information in the image.
Create from grayscale: (Normal Map mode only) Creates the map from the luminance information in the image.
Bypass sRGB sampling: (Default mode only) Uses the image colors exactly as they are, without applying gamma correction. This is useful when the texture encodes GUI elements or data other than image colors.
Generate Mip Maps: Select this to enable mip map generation. Mip maps are smaller versions of the texture that get used when the texture is very small on screen. For more info, see Mip Maps below.
In Linear Space: Generates mipmaps in linear color space.
Border Mip Maps: Select this to avoid colors seeping out to the edge of the lower mip levels. Used for light cookies (see below).
Mip Map Filtering: Two ways of mip map filtering are available to optimize image quality:
  Box: The simplest way to fade out the mipmaps; the mip levels become smoother and smoother as they go down in size.
  Kaiser: A sharpening Kaiser algorithm is run on the mip maps as they go down in size. If your textures are too blurry in the distance, try this option.
Fade Out Mips: Enable this to make the mipmaps fade to gray as the mip levels progress. This is used for detail maps. The left-most scroll is the first mip level to begin fading out. The right-most scroll defines the mip level where the texture is completely grayed out.
Wrap Mode: Selects how the texture behaves when tiled:
  Repeat: The texture repeats (tiles) itself
  Clamp: The texture's edges get stretched
Filter Mode: Selects how the texture is filtered when it gets stretched by 3D transformations:
  Point: The texture becomes blocky up close
  Bilinear: The texture becomes blurry up close
  Trilinear: Like Bilinear, but the texture also blurs between the different mip levels
Aniso Level: Increases texture quality when viewing the texture at a steep angle. Good for floor and ground textures. See below.

Per-Platform Overrides

When you are building for different platforms, you have to think about the resolution, size and quality of your textures for each target platform. You can set default options for all of these and then override them with specific values for particular platforms.


Default settings for all platforms
Max Texture Size: The maximum imported texture size. Artists often prefer to work with huge textures; use this to scale the texture down to a suitable size.
Texture Format: The internal representation used for the texture. This is a tradeoff between size and quality. In the examples below we show the final size of an in-game texture of 256 by 256 pixels:
  Compressed: Compressed RGB texture. This will be the most common format for diffuse textures. 4 bits per pixel (32 KB for a 256x256 texture).
  16 bit: Low-quality truecolor. Has 16 levels of red, green, blue and alpha.
  Truecolor: Truecolor; this is the highest quality. 256 KB for a 256x256 texture.

If you have set the Texture Type to Advanced, the Texture Format has different values.

デスクトップ

Texture Format: The internal representation used for the texture. This is a tradeoff between size and quality. In the examples below we show the final size of an in-game texture of 256 by 256 pixels:
  RGB Compressed DXT1: Compressed RGB texture. This is the most common format for diffuse textures. 4 bits per pixel (32 KB for a 256x256 texture).
  RGBA Compressed DXT5: Compressed RGBA texture. This is the main format used for diffuse and specular control textures. 1 byte per pixel (64 KB for a 256x256 texture).
  RGB 16 bit: 65 thousand colors with no alpha. Compressed DXT formats use less memory and usually look better. 128 KB for a 256x256 texture.
  RGB 24 bit: Truecolor but without alpha. 192 KB for a 256x256 texture.
  Alpha 8 bit: High-quality alpha channel but without any color. 64 KB for a 256x256 texture.
  RGBA 16 bit: Low-quality truecolor. Has 16 levels of red, green, blue and alpha. The compressed DXT5 format uses less memory and usually looks better. 128 KB for a 256x256 texture.
  RGBA 32 bit: Truecolor with alpha; this is the highest quality. At 256 KB for a 256x256 texture, this is expensive. Most of the time, DXT5 offers sufficient quality at a much smaller size. This format is used mainly for normal maps, as DXT compression there often carries a visible quality loss.

iOS

Texture Format: The internal representation used for the texture. This is a tradeoff between size and quality. In the examples below we show the final size of an in-game texture of 256 by 256 pixels:
  RGB Compressed PVRTC 4 bits: Compressed RGB texture. This is the most common format for diffuse textures. 4 bits per pixel (32 KB for a 256x256 texture).
  RGBA Compressed PVRTC 4 bits: Compressed RGBA texture. This is the main format used for diffuse and specular control textures with transparency. 4 bits per pixel (32 KB for a 256x256 texture).
  RGB Compressed PVRTC 2 bits: Compressed RGB texture. A lower-quality format suitable for diffuse textures. 2 bits per pixel (16 KB for a 256x256 texture).
  RGBA Compressed PVRTC 2 bits: Compressed RGBA texture. A lower-quality format suitable for diffuse and specular control textures. 2 bits per pixel (16 KB for a 256x256 texture).
  RGB Compressed DXT1: Compressed RGB texture. This format is not supported on iOS, but is kept for backwards compatibility with desktop projects.
  RGBA Compressed DXT5: Compressed RGBA texture. This format is not supported on iOS, but is kept for backwards compatibility with desktop projects.
  RGB 16 bit: 65 thousand colors with no alpha. Uses more memory than the PVRTC formats, but can be more suitable for UI or crisp textures without gradients. 128 KB for a 256x256 texture.
  RGB 24 bit: Truecolor but without alpha. 192 KB for a 256x256 texture.
  Alpha 8 bit: High-quality alpha channel but without any color. 64 KB for a 256x256 texture.
  RGBA 16 bit: Low-quality truecolor. Has 16 levels of red, green, blue and alpha. Uses more memory than the PVRTC formats, but can be handy if you need an exact alpha channel. 128 KB for a 256x256 texture.
  RGBA 32 bit: Truecolor with alpha; this is the highest quality. At 256 KB for a 256x256 texture, this is expensive. Most of the time, PVRTC formats offer sufficient quality at a much smaller size.
Compression quality: Choose Fast for fastest performance, Best for best image quality, and Normal for a balance between the two.

Android

Texture Format: The internal representation used for the texture. This is a tradeoff between size and quality. In the examples below we show the final size of an in-game texture of 256 by 256 pixels:
  RGB Compressed DXT1: Compressed RGB texture. Supported by Nvidia Tegra. 4 bits per pixel (32 KB for a 256x256 texture).
  RGBA Compressed DXT5: Compressed RGBA texture. Supported by Nvidia Tegra. 8 bits per pixel (64 KB for a 256x256 texture).
  RGB Compressed ETC 4 bits: Compressed RGB texture. This is the default texture format for Android projects. ETC1 is part of OpenGL ES 2.0 and is supported by all OpenGL ES 2.0 GPUs. It does not support alpha. 4 bits per pixel (32 KB for a 256x256 texture).
  RGB Compressed PVRTC 2 bits: Compressed RGB texture. Supported by Imagination PowerVR GPUs. 2 bits per pixel (16 KB for a 256x256 texture).
  RGBA Compressed PVRTC 2 bits: Compressed RGBA texture. Supported by Imagination PowerVR GPUs. 2 bits per pixel (16 KB for a 256x256 texture).
  RGB Compressed PVRTC 4 bits: Compressed RGB texture. Supported by Imagination PowerVR GPUs. 4 bits per pixel (32 KB for a 256x256 texture).
  RGBA Compressed PVRTC 4 bits: Compressed RGBA texture. Supported by Imagination PowerVR GPUs. 4 bits per pixel (32 KB for a 256x256 texture).
  RGB Compressed ATC 4 bits: Compressed RGB texture. Supported by Qualcomm Snapdragon. 4 bits per pixel (32 KB for a 256x256 texture).
  RGBA Compressed ATC 8 bits: Compressed RGBA texture. Supported by Qualcomm Snapdragon. 8 bits per pixel (64 KB for a 256x256 texture).
  RGB 16 bit: 65 thousand colors with no alpha. Uses more memory than the compressed formats, but can be more suitable for UI or crisp textures without gradients. 128 KB for a 256x256 texture.
  RGB 24 bit: Truecolor but without alpha. 192 KB for a 256x256 texture.
  Alpha 8 bit: High-quality alpha channel but without any color. 64 KB for a 256x256 texture.
  RGBA 16 bit: Low-quality truecolor. This is the default compression for textures that have an alpha channel. 128 KB for a 256x256 texture.
  RGBA 32 bit: Truecolor with alpha; this is the highest quality for textures with an alpha channel. 256 KB for a 256x256 texture.
Compression quality: Choose Fast for fastest performance, Best for best image quality, and Normal for a balance between the two.

Unless you're targeting specific hardware such as Tegra, we recommend using ETC1 compression. If needed, you can store an external alpha channel and still benefit from the lower texture footprint. If you absolutely want to store an alpha channel in a texture, RGBA 16 bit is the compression supported by all hardware vendors.

If your application uses an unsupported texture compression, the textures will be uncompressed to RGBA 32 and stored in memory alongside the compressed ones. In that case you lose time uncompressing the textures and waste memory by storing them twice, and this can also have a major impact on rendering performance.

Flash

Format: Image format:
  RGB JPG Compressed: RGB image data compressed in JPG format
  RGBA JPG Compressed: RGBA image data (i.e. with alpha) compressed in JPG format
  RGB 24-bit: Uncompressed RGB image data, 8 bits per channel
  RGBA 32-bit: Uncompressed RGBA image data, 8 bits per channel

Details

Supported Formats

Unity can read the following image file formats: PSD, TIFF, JPG, TGA, PNG, GIF, BMP, IFF and PICT. Note that Unity can import multi-layer PSD and TIFF files just fine. They are flattened automatically on import, but the layers are maintained in the assets themselves, so you don't lose any of your work when using these file types natively. This is important because it allows you to keep just one copy of your textures that you can use from Photoshop, through your 3D modelling app and into Unity.

Texture Sizes

Ideally, texture sizes should be powers of two on each side. These sizes are 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024 or 2048 pixels. The textures do not have to be square; that is, the width can be different from the height.

It is possible to use other (non-power-of-two) texture sizes with Unity. Non-power-of-two texture sizes work best when used on GUI Textures; if used on anything else, they will be converted to an uncompressed RGBA 32-bit format. That means they will take up more video memory than PVRTC (iOS) or DXT (desktop) compressed textures, and will be slower to load and slower to render (on iOS). In general you should use non-power-of-two sizes only for GUI purposes.

Non-power-of-two texture assets can be scaled up at import time using the Non Power of 2 option of the Advanced texture type in the import settings. Unity will scale the texture contents as requested, and in the game they will behave just like any other texture, so they can still be compressed and will load very fast.

One potential issue with non-power-of-two textures is that Unity converts them internally to a power-of-two size, and the stretching involved can cause slight visual artifacts.

UV Mapping

When mapping a 2D texture onto a 3D model, some sort of wrapping is done. This is called UV mapping and is done in your 3D modelling app. Inside Unity, you can scale and move the texture using Materials. Scaling normal and detail maps is especially useful.

Mip Maps

Mip maps are a list of progressively smaller versions of an image, used to optimise performance on real-time 3D engines. Objects that are far away from the camera use the smaller texture versions. Using mip maps uses 33% more memory, but not using them can cause a huge performance loss. You should always use mip maps for in-game textures; the only exception is textures that will never be minified.

Normal Maps

Normal maps are used by normal map shaders to make low-polygon models look as if they contain more detail. Unity uses normal maps encoded as RGB images. You also have the option to generate a normal map from a grayscale height map image.

Detail Maps

If you want to make a terrain, you normally use your main texture to show areas of grass, rocks and sand. If your terrain is of a reasonable size, it will end up very blurry. Detail textures hide this fact by fading in small details as your main texture gets closer.

When drawing detail textures, a neutral gray is invisible, white makes the main texture twice as bright, and black makes the main texture completely black.

Reflections (Cube Maps)

If you want to use textures for reflection maps (e.g. with the Reflective built-in shaders), you need to use Cubemap Textures.

Anisotropic Filtering

Anisotropic filtering increases texture quality when viewed from a grazing angle, at some expense of rendering cost (the cost is entirely on the graphics card). Increasing the anisotropy level is usually a good idea for ground and floor textures. In Quality Settings, anisotropic filtering can be forced for all textures or disabled completely.


No anisotropy (left) / maximum anisotropy (right) used on the ground texture

Page last updated: 2007-11-16



Procedural Materials

Unity incorporates a new asset type known as Procedural Materials. These are essentially the same as standard Materials except that the textures they use can be generated at runtime rather than being predefined and stored.

The script code that generates a texture procedurally will typically take up much less space in storage and transmission than a bitmap image and so Procedural Materials can help reduce download times. Additionally, the generation script can be equipped with parameters that can be changed in order to vary the visual properties of the material at runtime. These properties can be anything from color variations to the size of bricks in a wall. Not only does this mean that many variations can be generated from a single Procedural Material but also that the material can be animated on a frame-by-frame basis. Many interesting visual effects are possible - imagine a character gradually turning to stone or acid damaging a surface as it touches.

Unity's Procedural Material system is based around an industry standard product called Substance, developed by Allegorithmic.

Supported Platforms

In Unity, Procedural Materials are fully supported for standalone and webplayer build targets only (Windows and Mac OS X). For all other platforms, Unity will pre-render or bake them into ordinary Materials during the build. Although this clearly negates the runtime benefits of procedural generation, it is still useful to be able to create variations on a basic material in the editor.

Adding Procedural Materials to a Project

A Procedural Material is supplied as a Substance Archive file (SBSAR) which you can import like any other asset (drag and drop directly onto the Assets folder or use Assets->Import New Asset...). A Substance Archive asset contains one or more Procedural Materials and contains all the scripts and images required by these. Uncompiled SBS files are not supported.

Although they are implemented differently, Unity handles a Procedural Material just like any other Material. To assign a Procedural Material to a mesh, for example, you just drag and drop it onto the mesh exactly as you would with any other Material.

Procedural Properties

Each Procedural Material is a custom script which generates a particular type of material. These scripts are similar to Unity scripts in that they can have variables exposed for assignment in the inspector. For example, a "Brick Wall" Procedural Material could expose properties that let you set the number of courses of bricks, the colors of the bricks and the color of the mortar. This potentially offers infinite material variations from a single asset. These properties can also be set from a script at runtime in much the same way as the public variables of a MonoBehaviour script.
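As a rough sketch of how such a property might be set from code, using Unity's ProceduralMaterial scripting class (the "Brick_Color" property name here is purely hypothetical; use whatever names your Substance actually exposes):

```javascript
// Hypothetical sketch: adjust an exposed color property on a Procedural
// Material at runtime. "Brick_Color" is an assumed property name; check
// your Substance asset for the properties it really exposes.
function Start () {
    var substance : ProceduralMaterial = renderer.material as ProceduralMaterial;
    if (substance != null) {
        substance.SetProceduralColor("Brick_Color", Color.red);
        substance.RebuildTextures(); // regenerate textures with the new value
    }
}
```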

Procedural Materials can also incorporate complex texture animation. For example, you could animate the hands of a clock or have cockroaches running across a floor.

Creating Procedural Materials From Scratch

Procedural Materials can work with any combination of procedurally generated textures and stored bitmaps. Additionally, included bitmap images can be filtered and modified before use. Unlike a standard Material, a Procedural Material can use vector images in the form of SVG files which allows for resolution-independent textures.

The design tools available for creating Procedural Materials from scratch use visual, node-based editing similar to the kind found in artistic tools. This makes creation accessible to artists who may have little or no coding experience. As an example, here is a screenshot from Allegorithmic's Substance Designer which shows a "brick wall" Procedural Material under construction:

Obtaining Procedural Materials

Since Unity's Procedural Materials are based on the industry standard Substance product, Procedural Material assets are readily available from internet sources, including Unity's own Asset Store. Allegorithmic's Substance Designer can be used to create Procedural Materials, but there are other applications (3D modelling apps, for example) that incorporate the Substance technology and work just as well with Unity.

Performance and Optimization

Procedural Materials inherently tend to use less storage than bitmap images. However, the trade-off is that they are based around scripts and running those scripts to generate materials requires some CPU and GPU resources. The more complex your Procedural Materials are, the greater their runtime overhead.

Procedural Materials support a form of caching whereby the material is only updated if its parameters have changed since it was last generated. Further to this, some materials may have many properties that could theoretically be changed and yet only a few will ever need to change at runtime. In such cases, you can inform Unity about the variables that will not change to help it cache as much data as possible from the previous generation of the material. This will often improve performance significantly.
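A hedged sketch of how such a caching hint can be given through ProceduralMaterial.CacheProceduralProperty (the "Dirt_Amount" property name is illustrative only):

```javascript
// Hypothetical sketch: mark "Dirt_Amount" as a property that will be
// modified at runtime, so Unity can cache the rest of the generation data
// and regenerate faster when only this property changes.
function Start () {
    var substance : ProceduralMaterial = renderer.material as ProceduralMaterial;
    if (substance != null) {
        substance.CacheProceduralProperty("Dirt_Amount", true);
    }
}
```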

Procedural Materials can refer to hidden, system-wide, variables, such as elapsed time or number of Procedural Material instances (this data can be useful for animations). Changes in the values of these variables can still force a Procedural Material to update even if none of the explicitly defined parameters change.

Procedural Materials can also be used purely as a convenience in the editor (ie, you can generate a standard Material by setting the parameters of a Procedural Material and then "baking" it). This will remove the runtime overhead of material generation but naturally, the baked materials can't be changed or animated during gameplay.

Using the Substance Player to Analyze Performance

Since the complexity of a Procedural Material can affect runtime performance, Allegorithmic incorporates profiling features in its Substance Player tool. This tool is available to download for free from Allegorithmic's website.

Substance Player uses the same optimized rendering engine as the one integrated into Unity, so its rendering measurement is more representative of performance in Unity than that of Substance Designer.

Page last updated: 2012-10-12



Video Files

Note: This is a Unity Pro/Advanced feature only.

Desktop

Movie Textures are animated Textures that are created from a video file.

By placing a video file in your project's Assets folder, you can import the video to be used exactly as you would use a regular Texture.

Video files are imported via Apple QuickTime. Supported file types are whatever your QuickTime installation can play (usually .mov, .mpg, .mpeg, .mp4, .avi and .asf). On Windows, movie importing requires QuickTime to be installed (download it here).

Properties

The Movie Texture Inspector is very similar to the regular Texture Inspector.


Movie Textures generated from video files within Unity
Aniso Level: Increases Texture quality when viewing the texture at a steep angle. Good for floor and ground Textures.
Filter Mode: Selects how the Texture is filtered when it gets stretched by 3D transformations.
Loop: If enabled, the movie will loop when it finishes playing.
Quality: Compression of the Ogg Theora video file. A higher value means higher quality, but a larger file size.

Details

When a video file is added to your Project, it will automatically be imported and converted to Ogg Theora format. Once your Movie Texture has been imported, you can attach it to any GameObject or Material, just like a regular Texture.

Playing the Movie

Your Movie Texture will not play automatically when the game begins running. You must use a short script to tell it when to play.

// this line of code will make the Movie Texture begin playing
renderer.material.mainTexture.Play();

Attach the following script to toggle movie playback when the space bar is pressed:

function Update () {
	if (Input.GetButtonDown ("Jump")) {
		if (renderer.material.mainTexture.isPlaying) {
			renderer.material.mainTexture.Pause();
		}
		else {
			renderer.material.mainTexture.Play();
		}
	}
}

For more information about playing Movie Textures, see the Movie Texture Script Reference page.

Movie Audio

When a Movie Texture is imported, the audio track accompanying the visuals is imported as well. This audio appears as an AudioClip child of the Movie Texture.


The video's audio track appears as a child of the Movie Texture in the Project View

To play this audio, the Audio Clip must be attached to a GameObject, like any other Audio Clip. Drag the Audio Clip from the Project View onto any GameObject in the Scene or Hierarchy View.

Usually, this will be the same GameObject that is showing the Movie. Then use audio.Play() to make the audio track play along with the video.
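For instance (a minimal sketch, assuming the Movie Texture is on this object's renderer and the movie's AudioClip has been attached to an Audio Source on the same object), both can be started together:

```javascript
// Starts the movie and its audio track together when the scene begins.
// Assumes this GameObject renders the Movie Texture on its material and
// carries an Audio Source with the movie's AudioClip assigned.
function Start () {
    renderer.material.mainTexture.Play();
    audio.Play();
}
```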

iOS

Movie Textures are not supported on iOS. Instead, full-screen streaming playback is provided via Handheld.PlayFullScreenMovie.

You need to keep your videos inside the StreamingAssets folder located in your Project directory.

Unity iOS supports any movie file type that plays correctly on an iOS device, meaning files with the extensions .mov, .mp4, .mpv and .3gp that use one of the following compression standards:

  • H.264 Baseline Profile Level 3.0 video
  • MPEG-4 Part 2 video

For more information about supported compression standards, consult the MPMoviePlayerController class reference in the iOS SDK.

As soon as you call iPhoneUtils.PlayMovie or iPhoneUtils.PlayMovieURL, the screen will fade from your current content to the designated background color. It might take some time before the movie is ready to play; in the meantime, the player will continue displaying the background color and may also display a progress indicator to let the user know the movie is loading. When playback finishes, the screen will fade back to your content.

The video player does not respect switching to mute while playing videos

As written above, video files are played using Apple's embedded player (as of SDK 3.2 and iPhone OS 3.1.2 and earlier). This contains a bug that prevents Unity from switching to mute.

The video player does not respect the device's orientation

The Apple video player and iPhone SDK do not provide a way to adjust the orientation of the video. A common approach is to manually create two copies of each movie, one in landscape and one in portrait orientation. The orientation of the device can then be determined before playback so the correct version of the movie can be chosen.

Android

Movie Textures are not supported on Android. Instead, full-screen streaming playback is provided via Handheld.PlayFullScreenMovie.

You need to keep your videos inside the StreamingAssets folder located in your Project directory.

Unity Android supports any movie file type supported by Android for playback, meaning files with the extensions .mp4 and .3gp that use one of the following compression standards:

  • H.263
  • H.264 AVC
  • MPEG-4 SP

However, device vendors are keen on expanding this list, so some Android devices are able to play formats beyond those listed, such as HD videos.

For more information about the supported compression standards, consult the Android SDK Core Media Formats documentation.

As soon as you call iPhoneUtils.PlayMovie or iPhoneUtils.PlayMovieURL, the screen will fade from your current content to the designated background color. It might take some time before the movie is ready to play; in the meantime, the player will continue displaying the background color and may also display a progress indicator to let the user know the movie is loading. When playback finishes, the screen will fade back to your content.

Page last updated: 2007-11-16



Audio Files

As with Meshes or Textures, the workflow for Audio File assets is designed to be smooth and trouble free. Unity can import almost every common file format but there are a few details that are useful to be aware of when working with Audio Files.

Audio in Unity is either Native or Compressed. Unity supports most common formats (see the list below) and will import an audio file when it is added to the project. The default mode is Native, where the audio data from the original file is imported unchanged. However, Unity can also compress the audio data on import, simply by enabling the Compressed option in the importer. (iOS projects can make use of the hardware decoder - see the iOS documentation for further details.) The difference between the Native and Compressed modes is as follows:

Any Audio File imported into Unity is available from scripts as an Audio Clip instance, which is effectively just a container for the audio data. The clips must be used in conjunction with Audio Sources and an Audio Listener in order to actually generate sound. When you attach your clip to an object in the game, it adds an Audio Source component to the object, which has Volume, Pitch and numerous other properties. While a Source is playing, an Audio Listener can "hear" all sources within range, and the combination of those sources gives the sound that will actually be heard through the speakers. There can be only one Audio Listener in your scene, and this is usually attached to the Main Camera.
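As a small sketch of this relationship (the impactClip variable and the collision trigger are illustrative assumptions, not a prescribed setup), a script on a GameObject that has an Audio Source could play a clip like this, and any Audio Listener in range will pick it up:

```javascript
// Hypothetical sketch: play an assigned Audio Clip through this object's
// Audio Source whenever the object collides with something.
var impactClip : AudioClip; // assign an imported Audio Clip in the Inspector

function OnCollisionEnter () {
    audio.PlayOneShot(impactClip);
}
```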

Supported Formats

Format | Compressed as (Mac/PC) | Compressed as (Mobile)
MPEG (1/2/3) | Ogg Vorbis | MP3
Ogg Vorbis | Ogg Vorbis | MP3
WAV | Ogg Vorbis | MP3
AIFF | Ogg Vorbis | MP3
MOD | - | -
IT | - | -
S3M | - | -
XM | - | -

See the Sound chapter in the Creating Gameplay section of this manual for more information on using sound in Unity.

Audio Clips

Audio Clips contain the audio data used by Audio Sources. Unity supports mono, stereo and multichannel audio assets (up to eight channels). The audio file formats Unity can import are .aif, .wav, .mp3 and .ogg, plus the tracker module formats .xm, .mod, .it and .s3m. Tracker module assets behave the same way as any other audio asset in Unity, except that no waveform preview can be rendered in the asset import inspector.


The Audio Clip Inspector

Properties

Audio Format: The specific format that will be used for the sound at runtime.
  Native: Larger file size, higher quality. Best for very short sound effects.
  Compressed: Smaller file size, lower/variable quality. Best for medium-length sound effects and music.
3D Sound: If enabled, the sound will play back in 3D space. Both mono and stereo sounds can be played in 3D.
Force to mono: If enabled, the audio clip will be down-mixed to a single channel sound.
Load Type: The method Unity uses to load audio assets at runtime:
  Decompress on load: Sounds will be decompressed as soon as they are loaded. Use this option for smaller compressed sounds to avoid the performance overhead of decompressing on the fly. Be aware that decompressing sounds on load will use about ten times more memory than keeping them compressed in memory, so don't use this option for large files.
  Compressed in memory: Keep sounds compressed in memory and decompress while playing. This option has a slight performance overhead (especially for Ogg Vorbis compressed files), so only use it for bigger files. Note that, due to technical constraints, this option will silently switch to Stream From Disc (see below) for Ogg Vorbis assets on platforms that use FMOD audio.
  Stream from disc: Stream audio data directly from disc. This uses only a fraction of the original sound's size in memory. Use this for music or very long tracks. Depending on the hardware, it is generally advisable to limit this to one or two simultaneous streams.
Compression: The amount of compression applied to a Compressed clip. Statistics about the file size can be seen under the slider. Drag the slider to a point where the playback still sounds good enough while keeping the file small enough for your size and distribution needs.
Hardware Decoding: (iOS only) Available for compressed audio on iOS devices. Uses Apple's hardware decoder for less CPU-intensive decompression. Check out the platform-specific details for more info.
Gapless looping: (Android/iOS only) Use this when compressing a perfectly looping audio source file (in a non-compressed PCM format) to preserve the loop. Standard MPEG encoders introduce silence around the loop point, which plays back as a short "click" or "pop". Unity handles this gracefully for you.

Importing Audio Assets

Unity supports both Compressed and Native audio. Any type of file (except MP3/Ogg Vorbis) will initially be imported as Native. Compressed audio files must be decompressed by the CPU while the game is running, but are smaller on disk. If Stream is checked, the audio is decompressed on the fly; otherwise, it is decompressed entirely as soon as it loads. Native PCM formats (WAV, AIFF) have the benefit of higher fidelity without increasing the CPU load, but the resulting files are much larger. Module files (.mod, .it, .s3m, .xm) can deliver very high quality with an extremely low footprint.

As a general rule of thumb, Compressed audio (or modules) is best for long files like background music or dialog, while Native is better for short sound effects. Start with high compression and use the compression slider to fine-tune the amount to just before the point where the loss of sound quality becomes noticeable.

Using 3D Audio

If an audio clip is marked as a 3D Sound, it will be played back so as to simulate its position in the game world's 3D space. 3D sounds emulate the distance and location of sounds by attenuating their volume and panning across speakers. Both mono and multichannel sounds can be positioned in 3D. For multichannel audio, use the Spread option on the Audio Source to spread and split out the discrete channels in speaker space. Unity offers a variety of options to control and fine-tune the audio behaviour in 3D space; see Audio Source for more information.

プラットフォーム固有の詳細

iOS

On mobile platforms, compressed audio is encoded as MP3 to take advantage of less CPU-intensive decompression.

For performance reasons, audio clips can be played back using the Apple hardware codec. To enable this, check the Hardware Decoding checkbox in the Audio Importer. Note that only one hardware audio stream can be decompressed at a time, including the background iPod audio.

If the hardware decoder is not available, decompression will fall back to the software decoder (on iPhone 3GS and later, Apple's software decoder is used in preference to Unity's own (FMOD) decoder).

Android

On mobile platforms, compressed audio is encoded as MP3 to take advantage of less CPU-intensive decompression.

Page last updated: 2012-08-03



TrackerModules

Tracker Modules are essentially just packages of audio samples that have been modeled, arranged and sequenced programmatically. The concept was introduced in the 1980s (mainly in conjunction with the Amiga computer) and has been popular since the early days of game development and demo culture.

Tracker Module files are similar to MIDI files in many ways. The tracks are scores that contain information about when to play the instruments, and at what pitch and volume and from this, the melody and rhythm of the original tune can be recreated. However, MIDI has a disadvantage in that the sounds are dependent on the sound bank available in the audio hardware, so MIDI music can sound different on different computers. In contrast, tracker modules include high quality PCM samples that ensure a similar experience regardless of the audio hardware in use.

Supported formats

Unity supports the four most common module file formats, namely Impulse Tracker (.it), Scream Tracker (.s3m), Extended Module File Format (.xm), and the original Module File Format (.mod).

Benefits of Using Tracker Modules

Tracker module files differ from mainstream PCM formats (.aif, .wav, .mp3, and .ogg) in that they can be very small without a corresponding loss of sound quality. A single sound sample can be modified in pitch and volume (and can have other effects applied), so it essentially acts as an "instrument" which can play a tune without the overhead of recording the whole tune as a sample. As a result, tracker modules lend themselves to games, where music is required but where a large file download would be a problem.

Third Party Tools and Further References

Currently, the most popular tools to create and edit Tracker Modules are MilkyTracker for OSX and OpenMPT for Windows. For more information and discussion, please see the blog post .mod in Unity from June 2010.

Page last updated: 2011-11-15



Scripting

This brief introduction explains how to create and use scripts in a project. For detailed information about the Scripting API, please view the Scripting Reference. For detailed information about creating game play through scripting, please view the Creating Gameplay page of this manual.

Behaviour scripts in Unity can be written in JavaScript, C#, or Boo. It is possible to use any combination of the three languages in a single project, although there are certain restrictions in cases where one script incorporates classes defined in another script.

Creating New Scripts

Unlike other assets like Meshes or Textures, Script files can be created from within Unity. To create a new script, choose Assets->Create->JavaScript (or Assets->Create->C Sharp Script or Assets->Create->Boo Script) from the main menu. This will create a new script called NewBehaviourScript and place it in the selected folder in Project View. If no folder is selected in Project View, the script will be created at the root level.

You can edit the script by double-clicking on it in the Project View. This will launch your default text editor as specified in Unity's preferences. To set the default script editor, change the drop-down item in Unity->Preferences->External Script editor.

These are the contents of a new, empty behaviour script:

function Update () {
} 

A new, empty script does not do a lot on its own, so let's add some functionality. Change the script to read the following:

function Update () {
    print("Hello World");
} 

When executed, this code will print "Hello World" to the console. But there is nothing that causes the code to be executed yet. We have to attach the script to an active GameObject in the Scene before it will be executed.

Attaching scripts to objects

Save the above script and create a new object in the Scene by selecting GameObject->Create Other->Cube. This will create a new GameObject called "Cube" in the current Scene.

Now drag the script from the Project View to the Cube (in the Scene or Hierarchy View, it doesn't matter). You can also select the Cube and choose Component->Scripts->New Behaviour Script. Either of these methods will attach the script to the Cube. Every script you create will appear in the Component->Scripts menu.

If you select the Cube and look at the Inspector, you will see that the script is now visible. This means it has been attached.

Press Play to test your creation. You should see the text "Hello World" appear beside the Play/Pause/Step buttons. Exit play mode when you see it.

Manipulating the GameObject

A print() statement can be very handy when debugging your script, but it does not manipulate the GameObject it is attached to. Let's change the script to add some functionality:

function Update () {
    transform.Rotate(0, 5*Time.deltaTime, 0);
} 

If you're new to scripting, it's okay if this looks confusing. These are the important concepts to understand:

  1. function Update () {} is a container for code that Unity executes multiple times per second (once per frame).
  2. transform is a reference to the GameObject's Transform Component.
  3. Rotate() is a function contained in the Transform Component.
  4. The numbers in-between the commas represent the degrees of rotation around each axis of 3D space: X, Y, and Z.
  5. Time.deltaTime is a member of the Time class that evens out movement over one second, so the cube will rotate at the same speed no matter how many frames per second your machine is rendering. Therefore, 5 * Time.deltaTime means 5 degrees per second.

With all this in mind, we can read this code as "every frame, rotate this GameObject's Transform component a small amount so that it will equal five degrees around the Y axis each second."

You can access lots of different Components the same way as we accessed transform already. You have to add Components to the GameObject using the Component menu. All the Components you can access directly are listed under Variables on the GameObject Scripting Reference Page.

For more information about the relationship between GameObjects, Scripts, and Components, please jump ahead to the GameObjects page or Using Components page of this manual.

The Power of Variables

Our script so far will always rotate the Cube 5 degrees each second. We might want it to rotate a different number of degrees per second. We could change the number and save, but then we have to wait for the script to be recompiled and we have to enter Play mode before we see the results. There is a much faster way to do it. We can experiment with the speed of rotation in real-time during Play mode, and it's easy to do.

Instead of typing 5 into the Rotate() function, we will declare a speed variable and use that in the function. Change the script to the following code and save it:

var speed = 5.0;

function Update () {
    transform.Rotate(0, speed*Time.deltaTime, 0);
}

Now, select the Cube and look at the Inspector. Notice how our speed variable appears.

This variable can now be modified directly in the Inspector. Select it, press Return and change the value. You can also right- or option-click on the value and drag the mouse up or down. You can change the variable at any time, even while the game is running.

Hit Play and try modifying the speed value. The Cube's rotation speed will change instantly. When you exit Play mode, you'll see that your changes revert to the values they had before you entered Play mode. This way you can play, adjust, and experiment to find the best value, then apply that value permanently.

The technique of changing a variable's value in the Inspector makes it easy to reuse one script on many objects, each with a different variable value. If you attach the script to multiple Cubes, and change the speed of each cube, they will all rotate at different speeds even though they use the same script.

Accessing Other Components

When writing a script Component, you can access other components on the GameObject from within that script.

Using the GameObject members

You can directly access any member of the GameObject class. You can see a list of all the GameObject class members here. If any of the indicated classes are attached to the GameObject as a Component, you can access that Component directly through the script by simply typing the member name. For example, typing transform is equivalent to gameObject.transform. The gameObject is assumed by the compiler, unless you specifically reference a different GameObject.

Typing this refers to the script Component that you are writing. Typing this.gameObject refers to the GameObject that the script is attached to. You can access the same GameObject by simply typing gameObject. Logically, typing this.transform is the same as typing transform. If you want to access a Component that is not included as a GameObject member, you have to use gameObject.GetComponent(), which is explained on the next page.

There are many Components that can be directly accessed in any script. For example, if you want to access the Translate function of the Transform component, you can just write transform.Translate() or gameObject.transform.Translate(). This works because all scripts are attached to a GameObject. So when you write transform you are implicitly accessing the Transform Component of the GameObject that is being scripted. To be explicit, you write gameObject.transform. There is no advantage in one method over the other, it's all a matter of preference for the scripter.

To see a list of all the Components you can access implicitly, take a look at the GameObject page in the Scripting Reference.
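To make the equivalence concrete, here is a minimal sketch (it assumes, as always, that the script is attached to a GameObject):

```javascript
// Each of these lines reaches the same Transform component of the
// GameObject this script is attached to; pick whichever reads best.
function Update () {
    transform.Rotate(0, 1, 0);                    // implicit access
    // gameObject.transform.Rotate(0, 1, 0);      // explicit, identical effect
    // this.gameObject.transform.Rotate(0, 1, 0); // also identical
}
```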

Using GetComponent()

There are many Components which are not referenced directly as members of the GameObject class. So you cannot access them implicitly; you have to access them explicitly. You do this by calling GetComponent("component name") and storing a reference to the result. This is most common when you want to make a reference to another script attached to the GameObject.

Pretend you are writing Script B and you want to make a reference to Script A, which is attached to the same GameObject. You would have to use GetComponent() to make this reference. In Script B, you would simply write:

scriptA = GetComponent("ScriptA");

For more help with using GetComponent(), take a look at the GetComponent() Script Reference page.
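As a sketch of how the stored reference is then used (ScriptA and its DoSomething function are hypothetical names), Script B might look like this:

```javascript
// Script B (hypothetical): grab a reference to Script A on the same
// GameObject, then call one of its functions.
var scriptA : ScriptA;

function Start () {
    scriptA = GetComponent.<ScriptA>(); // typed form; avoids the string lookup
    scriptA.DoSomething();              // assumes ScriptA defines DoSomething()
}
```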

Accessing variables in other script Components

All scripts attached to your GameObjects are Components. Therefore, to access a public variable (or method) in another script, you use the GetComponent method. For example:

function Start () {
   // Print the position of the transform component, for the gameObject this script is attached to
   Debug.Log(gameObject.GetComponent.<Transform>().position);
}

In the previous example the generic function GetComponent.<T>() is used to access the position property of the Transform component. The same technique can be used to access a variable in a custom script Component:

(MyClass.js)
public var speed : float = 3.14159;

(MyOtherClass.js)
function Start () {
   // Print the speed variable from the MyClass script Component attached to the gameObject
   Debug.Log(gameObject.GetComponent.<MyClass>().speed);
}

Accessing a variable defined in C# from Javascript

To access variables defined in C# scripts the compiled Assembly containing the C# code must exist when the Javascript code is compiled. Unity performs the compilation in different stages as described in the Script Compilation section in the Scripting Reference. If you want to create a Javascript that uses classes or variables from a C# script just place the C# script in the "Standard Assets", "Pro Standard Assets" or "Plugins" folder and the Javascript outside of these folders. The code inside the "Standard Assets", "Pro Standard Assets" or "Plugins" is compiled first and the code outside is compiled in a later step making the Types defined in the compilation step (your C# script) available to later compilation steps (your Javascript script).

In general the code inside the "Standard Assets", "Pro Standard Assets" or "Plugins" folders, regardless of the language (C#, Javascript or Boo), will be compiled first and available to scripts in subsequent compilation steps.
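As a minimal sketch of this setup (the file names and the score variable are hypothetical), the C# class lives in one of the early-compiled folders and the Javascript lives outside them:

```javascript
// Plugins/ScoreData.cs -- C#, compiled in an early pass:
//
//   using UnityEngine;
//   public class ScoreData : MonoBehaviour {
//       public int score = 0;
//   }

// Assets/ScoreReader.js -- Javascript, compiled later, so ScoreData is visible:
function Start () {
    var data : ScoreData = GetComponent.<ScoreData>();
    Debug.Log(data.score);
}
```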

Optimizing variable access

In some circumstances you may be using GetComponent multiple times in your code, or multiple times per frame. Every call to GetComponent does a few extra steps internally to get the reference to the component you require. A more efficient approach is to store the reference to the component for example in your Start() function. As you will be storing the reference and not retrieving directly it is always good practice to check for null references:

(MyClass.js)
public var speed : float = 3.14159;

(MyOtherClass.js)
private var myClass : MyClass;
function Start () {
   // Get a reference to the MyClass script Component attached to the gameObject
   myClass = gameObject.GetComponent.<MyClass>();
}
function Update () {
   // Verify that the reference is still valid and print the speed variable
   if(myClass != null)
      Debug.Log (myClass.speed);
}

Static Variables

It is also possible to declare variables in your classes as static. Exactly one instance of a static variable exists for a given class, and it can be accessed and modified without an instance of the class:

(MyClass.js)
static public var speed : float = 3.14159;

(MyOtherClass.js)
function Start () {
   Debug.Log (MyClass.speed);
}

It is recommended not to use static variables for object references, so that unused objects can be removed from memory.

Where to go from here

This was just a short introduction on how to use scripts inside the Editor. For more examples, check out the Unity tutorials, available for free on our Asset Store. You should also read through the Scripting Overview in the Script Reference, which contains a more thorough introduction into scripting with Unity along with pointers to more in-depth information. If you're really stuck, be sure to visit the Unity Answers or Unity Forums and ask questions there. Someone is always willing to help.

Page last updated: 2012-06-28



Asset Store

Unity's Asset Store is home to a growing library of free and commercial assets created both by Unity Technologies and also members of the community. A wide variety of assets is available, covering everything from textures, models and animations to whole project examples, tutorials and Editor extensions. The assets are accessed from a simple interface built into the Unity Editor and are downloaded and imported directly into your project.

Access and Navigation

You can open the Asset Store window by selecting Window->AssetStore from the main menu. On your first visit, you will be prompted to create a free user account which you will use to access the Store subsequently.


The Asset Store front page.

The Store provides a browser-like interface which allows you to navigate either by free text search or by browsing packages and categories. To the left of the main toolbar are the familiar browsing buttons for navigating through the history of viewed items:

To the right of these are buttons for viewing the Download Manager and for viewing the current contents of your shopping cart.

The Download Manager allows you to view the packages you have already bought and also to find and install any updates. Additionally, the standard packages supplied with Unity can be viewed and added to your project with the same interface.


The Download Manager.

Location of Downloaded Asset Files

You will rarely, if ever, need to access the files downloaded from the Asset Store directly. However, if you do need to, you can find them in

  ~/Library/Unity/Asset Store

...on the Mac and in

  C:\Users\accountName\AppData\Roaming\Unity\Asset Store

...on Windows. These folders contain subfolders that correspond to particular Asset Store vendors - the actual asset files are contained in the appropriate subfolders.

Page last updated: 2011-12-09



Asset Server

Unity Asset Server Overview

The Unity Asset Server is an asset and version control system with a graphical user interface integrated into Unity. It is meant to be used by team members working together on a project on different computers, either in person or remotely. The Asset Server is highly optimized for handling large binary assets in order to cope with multi-gigabyte project folders. When uploading assets, Import Settings and other metadata about each asset are uploaded to the asset server as well. Renaming and moving files is at the core of the system and well supported.

It is available only for Unity Pro and requires an additional license per client. To purchase an Asset Server Client License, please visit the Unity store at http://unity3d.com/store

New to Source Control?

If you have never used Source Control before, getting started with any versioning system can be a little daunting. Source Control works by storing an entire collection of all your assets - meshes, textures, materials, scripts, and everything else - in a database on some kind of server. That server might be your home computer, the same one that you use to run Unity. It might be a different computer in your local network. It might be a remote machine colocated in a different part of the world. It could even be a virtual machine. There are a lot of options, but the location of the server doesn't matter at all. The important thing is that you can access it somehow over your network, and that it stores your game data.

In a way, the Asset Server functions as a backup of your Project Folder. You do not directly manipulate the contents of the Asset Server while you are developing. You make changes to your Project locally, then when you are done, you Commit Changes to the Project on the Server. This makes your local Project and the Asset Server Project identical.

Now, when your fellow developers make a change, the Asset Server is identical to their Project, but not yours. To synchronize your local Project, you request to Update from Server. Now, whatever changes your team members have made will be downloaded from the server to your local Project.

This is the basic workflow for using the Asset Server. In addition to this basic functionality, the Asset Server allows for rollback to previous versions of assets, detailed file comparison, merging two different scripts, resolving conflicts, and recovering deleted assets.

Setting up the Asset Server

The Asset Server requires a one time server setup and a client configuration for each user. You can read about how to do that in the Asset Server Setup page.

The rest of this guide explains how to deploy, administrate, and regularly use the Asset Server.

Daily use of the Asset Server

This section explains the common tasks, workflow and best practices for using the Asset Server on a day-to-day basis.

Getting Started

If you are joining a team that has a lot of work stored on the Asset Server already, this is the quickest way to get up and running correctly. If you are starting your own project from scratch, you can skip down to the Workflow Fundamentals section.

  1. Create a new empty Project with no packages imported
  2. Go to Edit->Project Settings->Editor and select Asset Server as the version control mode
  3. From the menubar, select Window->Version Control
  4. Click the Connection button
  5. Enter your user name and password (provided by your Asset Server administrator)
  6. Click Show Projects and select the desired project
  7. Click Connect
  8. Click the Update tab
  9. Click the Update button
  10. If a conflict occurs, discard all local versions
  11. Wait for the update to complete
  12. You are ready to go

Continue reading for detailed information on how to use the Asset Server effectively every day.

Workflow Fundamentals

When using the Asset Server with a multi-person team, it is generally good practice to Update all changed assets from the server when you begin working, and Commit your changes at the end of the day, or whenever you're done working. You should also commit changes when you have made significant progress on something, even if it is in the middle of the day. Committing your changes regularly and frequently is recommended.

Understanding the Server View

The Server View is your window into the Asset Server you're connected to. You can open the Server View by selecting Window->Version Control.


The Overview tab

The Server View is broken into three tabs: Overview, Update, and Commit. Overview will show you any differences between your local project and the latest version on the server, with options to quickly commit local changes or download the latest updates. Update will show you the latest remote changes on the server and allow you to download them to your local project. Commit allows you to create a Changeset and commit it to the server for others to download.

Connecting to the server

Before you can use the asset server, you must connect to it. To do this you click the Connection button, which takes you to the connection screen:


The Asset Server connection screen

Here you need to fill in:

  1. Server address
  2. Username
  3. Password

By clicking Show Projects you can see the available projects on the asset server, and choose which one to connect to by clicking Connect. Note that the username and password you use can be obtained from your system administrator, who created the accounts when the Asset Server was installed.

Updating from the Server

To download all updates from the server, select the Update tab and you will see a list of the latest committed Changesets. By selecting one of these you can see what was changed in the project, as well as the provided commit message. Click Update and all the Changeset updates will be downloaded.


The Update Tab

Committing Changes to the Server

When you have made a change to your local project and you want to store those changes on the server, you use the top Commit tab.


The Commit tab

Now you will be able to see all the local changes made to the project since your last update, and will be able to select which changes you wish to upload to the server. You can add changes to the changeset either by manually dragging them into the changeset field, or by using the buttons placed below the commit message field. Remember to type in a commit message which will help you when you compare versions or revert to an earlier version later on, both of which are discussed below.

Resolving conflicts

With multiple people working on the same collection of data, conflicts will inevitably arise. Remember, there is no need to panic! If a conflict exists, you will be presented with the Conflict Resolution dialog when updating your project.


The Conflict Resolution screen

Here, you will be informed of each individual conflict, and be presented with different options to resolve each individual conflict. For any single conflict, you can select Skip Asset (which will not download that asset from the server), Discard My Changes (which will completely overwrite your local version of the asset) or Ignore Server Changes (which will ignore the changes others made to the asset and after this update you will be able to commit your local changes over server ones) for each individual conflict. Additionally, you can select Merge for text assets like scripts to merge the server version with the local version.

Note: If you choose to discard your changes, the asset will be updated to the latest version from the server (i.e., it will incorporate other users' changes that have been made while you were working). If you want to get the asset back as it was when you started working, you should revert to the specific version that you checked out. (See Browsing revision history and reverting assets below.)

If you run into a conflict while you are committing your local changes, Unity will refuse to commit your changes and inform you that a conflict exists. To resolve the conflicts, select Update. Your local changes will not automatically be overwritten. At this point you will see the Conflict Resolution dialog, and can follow the instructions in the above paragraph.

Browsing revision history and reverting assets

The Asset Server retains all uploaded versions of an asset in its database, so you can revert your local version to an earlier version at any time. You can choose to restore either the entire project or single files. To revert to an older version of an asset or project, select the Overview tab, then click Show History under Asset Server Actions. You will see a list of all commits and can select and restore any file, or the entire project, to an older version.


The History dialog

Here, you can see the version number and the comments added with each version of the asset or project; this is one reason why descriptive comments are helpful. Select any asset to see its history, or select Entire Project for all changes made in the project. Find the revision you need; you can select either a whole revision or a particular asset within it. Then click Download Selected File to replace your local asset with a copy of the selected revision, or Revert All Project to revert the entire project to the selected revision.

Note that if there are any differences between your local version and the selected server version, those local changes will be lost when the local version is reverted.

If you only want to abandon the changes made to the local copy, you don't have to revert. You can discard those local modifications by selecting Discard Changes in the main asset server window. This will immediately download the current version of the project from the server to your local Project.

Comparing asset versions

If you're curious to see the differences between two particular versions, you can explicitly compare them. To do this, open the History window, select the revision and asset you want to compare, and press Compare to Local Version. If you need to compare two different revisions of an asset, right-click it, select Compare to Another Revision from the context menu, then find and select the revision you want to compare against.

Note: this feature requires that you have one of the supported file diff/merge tools installed. Supported tools are:

Recovering deleted assets

Deleting a local asset and committing the delete to the server does not delete the asset permanently; it can be restored through the History window in the Overview tab, just like any previous version of an asset.


The History dialog

Expand the Deleted Assets item, find and select the assets in the list, and hit Recover; the selected assets will be downloaded and re-added to the local project. If the folder the asset was in before deletion still exists, the asset will be restored to its original location; otherwise it will be added to the root of the Assets folder in the local project.

Best Practices & Common Issues

This is a compilation of best practices and solutions to problems which will help you when using the Asset Server:

  1. Backup, Backup, Backup
    • Maintain a backup of your database. It is very important to do this. In the unfortunate case of a hardware problem, a virus, a user error, etc., you may lose all of your work. Therefore, make sure you have a backup system in place. You can find lots of resources online for setting up backup systems.
  2. Stop the server before shutting the machine down
    • This can prevent "fast shutdowns" from being generated in the PostgreSQL (Asset Server) log. If this occurs the Asset Server has to do a recovery due to an improper shut down. This can take a very long time if you have a large project with many commits.
  3. Resetting your password from the Console
    • You can reset your password directly from a shell, console or command line using the following command:

      psql -U unitysrv -d template1 -c"alter role admin with password 'MYPASSWORD'"
  4. Can't connect to Asset Server
    • The password may have expired. Try resetting your password.
    • Also the username is case sensitive: "Admin" != "admin". Make sure you are using the correct case.
    • Make sure the server is actually running:
      • On OS X or Linux you can type on the terminal: ps -aux
      • On Windows you can use the Task Manager.
    • Verify that the Asset Server is not running on more than one computer in your Network. You could be connecting to the wrong one.
  5. The Asset Server doesn't work in 64-bit Linux
    • The asset server can run OK on 64-bit Linux machines if you install 32-bit versions of the required packages. You can use "dpkg -i --force-architecture" to do this.
  6. Use the Asset Server logs to get more information
    • Windows:
      • \Unity\AssetServer\log
    • OS X:
      • /Library/UnityAssetServer/log

Asset Server training complete

You should now be equipped with the knowledge you need to start using the Asset Server effectively. Get to it, and don't forget the good workflow fundamentals. Commit changes often, and don't be afraid of losing anything.

Page last updated: 2011-10-31



Asset Cache Server

Unity has a completely automatic asset pipeline. Whenever a source asset like a .psd or an .fbx file is modified, Unity will detect the change and automatically reimport it. The imported data from the file is subsequently stored by Unity in its own internal format. The best parts about the asset pipeline are the "hot reloading" functionality and the guarantee that all your source assets are always in sync with what you see. This feature also comes at a cost. Any asset that is modified has to be reimported right away. When working in large teams, after getting latest from Source Control, you often have to wait for a long time to re-import all the assets modified or created by other team members. Also, switching your project platform back and forth between desktop and mobile will trigger a re-import of most assets.

The time it takes to import assets can be drastically reduced by caching the imported asset data on the Cache Server.

Each asset import is cached based on

If any of the above change, the asset gets reimported, otherwise it gets downloaded from the Cache Server.

When you enable the cache server in the preferences, you can even share asset imports across multiple projects.

Note that once the cache server is set up, this process is completely automatic, which means there are no additional workflow requirements. It will simply reduce the time it takes to import projects without getting in your way.

How to set up a Cache Server (user)

Setting up the Cache Server couldn't be easier. All you need to do is click Use Cache Server in the preferences and tell the local machine's Unity Editor where the Cache Server is.

This can be found in Unity->Preferences on the Mac or Edit->Preferences on the PC.

If you are hosting the Cache Server on your local machine, specify localhost for the server address. However, due to hard drive size limitations, it is recommended you host the Cache Server on a separate machine.

How to set up a Cache Server (admin)

Admins need to set up the Cache Server machine that will host the cached assets.

You need to:

The Cache Server needs to be on a reliable machine with very large storage (much larger than the size of the project itself, as there will likely be multiple versions of imported resources stored). If the hard disk becomes full the Cache Server could perform slowly.

Installing the Cache Server as a service

The provided .sh and .cmd scripts should be set up as a service on the server. The cache server can be safely killed and restarted at any time, since it uses atomic file operations.

Cache Server Configuration

If you simply start the Cache Server by double clicking the script, it will create a "cache" directory next to the script, and keep its data in there. The cache directory is allowed to grow to up to 50 GB. You can configure the size and the location of the data using command line options, like this:

./RunOSX.command --path ~/mycachePath --size 2000000000

--path lets you specify a cache location, and --size lets you specify the maximum cache size in bytes.

Recommendations for the machine hosting the Cache Server

We recommend equipping the machine with plenty of RAM; for best performance, there should be enough RAM to hold an entire imported project folder. In addition, it is best to have a machine with a fast hard drive and a fast Ethernet connection. The hard drive should also have sufficient free space. On the other hand, the Cache Server has very low CPU usage.

One of the main distinctions between the Cache Server and version control is that its cached data can always be rebuilt locally. It is simply a tool for improving performance. For this reason it doesn't make sense to use a Cache Server over the Internet. If you have a distributed team, we recommend that you place a separate cache server in each location.

We recommend that you run the cache server on a Linux or Mac OS X machine. The Windows file system is not particularly well optimized for how the Asset Cache Server stores data and problems with file locking on Windows can cause issues that don't occur on Linux or Mac OS X.

Page last updated: 2012-10-26



Cache Server FAQ

Will the size of my Cache Server database grow indefinitely as more and more resources get imported and stored?

The Cache Server automatically removes assets that have not been used for a period of time (of course, if those assets are needed again, they will be re-created on next use).

Does the cache server work only with the asset server?

The cache server is designed to be transparent to source/version control systems and so you are not restricted to using Unity's asset server.

What changes will cause the imported file to get regenerated?

When Unity is about to import an asset, it generates an MD5 hash of all source data.

For a texture this consists of:

If that hash is different from what is stored on the Cache Server, the asset will be reimported, otherwise the cached version will be downloaded. The client Unity editor will only pull assets from the server as they are needed - assets don't get pushed to each project as they change.

How do I work with Asset dependencies?

The Cache Server does not handle dependencies. Unity's asset pipeline does not deal with the concept of dependencies. It is built in such a way as to avoid dependencies between assets. AssetPostprocessors are a common technique used to customize the Asset importer to fit your needs. For example, you might want to add MeshColliders to some GameObjects in an fbx file based on their name or tag.

It is also easy to use AssetPostprocessors to introduce dependencies. For example, you might use data from a text file next to the asset to add additional components to the imported game objects. This is not supported by the Cache Server. If you want to use the Cache Server, you will have to remove dependencies on other assets in the project folder. Since the Cache Server doesn't know anything about the dependency in your postprocessor, it will not know that anything has changed and will thus use an old cached version of the asset.

In practice there are plenty of ways you can do asset postprocessing to work well with the cache server. You can use:

Are there any issues when working with materials?

Modifying materials that already exist might cause trouble. When using the Cache Server, Unity validates that the references to materials are maintained. But since no postprocessing calls will be invoked, the contents of the material cannot be changed when a model is imported through the Cache Server. Thus you might get different results when importing with or without the Cache Server. It is best never to modify materials that already exist on disk.

Are there any asset types which will not be cached by the server?

There are a few kinds of asset data which the server doesn't cache. There isn't really anything to be gained by caching script files and so the server will ignore them. Also, native files used by 3D modelling software (Maya, 3D Max, etc) are converted to FBX using the application itself. Currently, the asset server caches neither the native file nor the intermediate FBX file generated in the import process. However, it is possible to benefit from the server by exporting files as FBX from the modelling software and adding those to the Unity project.

Page last updated: 2012-09-04



Behind the Scenes

Unity automatically imports assets and manages various kinds of additional data about them for you. Below is a description of how this process works.

When you place an Asset such as a texture in the Assets folder, Unity will first detect that a new file has been added (the editor frequently checks the contents of the Assets folder against the list of assets it already knows about). Once a unique ID value has been assigned to the asset to enable it to be accessed internally, it will be imported and processed. The asset that you actually see in the Project panel is the result of that processing and its data contents will typically be different to those of the original asset. For example, a texture may be present in the Assets folder as a PNG file but will be converted to an internal format after import and processing.

Using an internal format for assets allows Unity to keep additional data known as metadata, which enables the asset data to be handled in a much more flexible way. For example, the Photoshop file format is convenient to work with, but you wouldn't expect it to support game engine features such as mip maps. Unity's internal format, however, can add extra functionality like this to any asset type. All metadata for assets is stored in the Library folder. As a user, you should never have to alter the Library folder manually, and attempting to do so may corrupt the project.

Unity allows you to create folders in the Project view to help you organize assets, and those folders will be mirrored in the actual filesystem. However, you must move the files within Unity by dragging and dropping in the Project view. If you attempt to use the filesystem/desktop to move the files then Unity will misinterpret the change (it will appear that the old asset has been deleted and a new one created in its place). This will lose information, such as links between assets and scripts in the project.

When backing up a project, you should always back up the main Unity project folder, containing both the Assets and Library folders. All the information in the subfolders is crucial to the way Unity works.

Page last updated: 2011-11-16



Creating Gameplay

Unity empowers game designers to make games. What's special about Unity is that you don't need years of coding experience or an art degree to make a fun game. There are only a handful of basic workflow concepts you need in order to learn Unity, and once you understand them, you can move straight on to making games. Because Unity saves you time getting your game up and running, you can spend that time refining, balancing, and tweaking your game to make it as polished as possible.

This section explains the core concepts you need to create unique, amazing, and fun gameplay. Most of these concepts require you to write Scripts. For an overview of creating and using scripts, see the Scripting page.

Page last updated: 2012-11-09



Instantiating Prefabs

By this point you should understand the concept of Prefabs at a fundamental level. They are a collection of predefined GameObjects & Components that are re-usable throughout your game. If you don't know what a Prefab is, we recommend you read the Prefabs page for a more basic introduction.

Prefabs come in very handy when you want to instantiate complicated GameObjects at runtime. The alternative is to create GameObjects from scratch using code; as the scenarios below show, instantiating Prefabs has many advantages over that approach.

Common Scenarios

To illustrate the strength of Prefabs, let's consider some basic situations where they would come in handy:

  1. Building a wall out of a single "brick" Prefab by creating it several times in different positions.
  2. A rocket launcher instantiates a flying rocket Prefab when fired. The Prefab contains a Mesh, Rigidbody, Collider, and a child GameObject with its own trail Particle System.
  3. A robot exploding to many pieces. The complete, operational robot is destroyed and replaced with a wrecked robot Prefab. This Prefab would consist of the robot split into many parts, all set up with Rigidbodies and Particle Systems of their own. This technique allows you to blow up a robot into many pieces, with just one line of code, replacing one object with a Prefab.

Building a wall

This explanation will illustrate the advantages of using a Prefab vs creating objects from code.

First, let's build a brick wall from code:

// JavaScript
function Start () {
    for (var y = 0; y < 5; y++) {
        for (var x = 0; x < 5; x++) {
            var cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
            cube.AddComponent(Rigidbody);
            cube.transform.position = Vector3 (x, y, 0);
        }
    }
}


// C#
public class Instantiation : MonoBehaviour {

	void Start() {
		for (int y = 0; y < 5; y++) {
			for (int x = 0; x < 5; x++) {
				GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
				cube.AddComponent<Rigidbody>();
				cube.transform.position = new Vector3(x, y, 0);
			}
		}
	}
}

If you execute that code, you will see an entire brick wall created when you enter Play Mode. There are two lines relevant to the functionality of each individual brick: the CreatePrimitive() line and the AddComponent() line. Not so bad right now, but each of our bricks is un-textured. Every additional action you want to perform on the brick, like changing the texture, the friction, or the Rigidbody mass, is an extra line.

If you create a Prefab and perform all your setup beforehand, you use one line of code to perform the creation and setup of each brick. This relieves you from maintaining and changing a lot of code when you decide you want to make changes. With a Prefab, you just make your changes and press Play. No code alterations required.

If you're using a Prefab for each individual brick, this is the code you need to create the wall:

// JavaScript

var brick : Transform;
function Start () {
    for (var y = 0; y < 5; y++) {
        for (var x = 0; x < 5; x++) {
            Instantiate(brick, Vector3 (x, y, 0), Quaternion.identity);
        }
    }
}


// C#
public Transform brick;

void Start() {
	for (int y = 0; y < 5; y++) {
		for (int x = 0; x < 5; x++) {
			Instantiate(brick, new Vector3(x, y, 0), Quaternion.identity);
		}
	}
}

This is not only very clean but also very reusable. There is nothing saying we are instantiating a cube or that it must contain a rigidbody. All of this is defined in the Prefab and can be quickly created in the Editor.

Now we only need to create the Prefab, which we do in the Editor. Here's how:

  1. Choose GameObject->Create Other->Cube
  2. Choose Component->Physics->Rigidbody
  3. Choose Assets->Create->Prefab
  4. In the Project View, change the name of your new Prefab to "Brick"
  5. Drag the cube you created in the Hierarchy onto the "Brick" Prefab in the Project View
  6. With the Prefab created, you can safely delete the Cube from the Hierarchy (Delete on Windows, Command-Backspace on Mac)

We've created our Brick Prefab, so now we have to attach it to the brick variable in our script. Select the empty GameObject that contains the script. Notice that a new variable has appeared in the Inspector, called "brick".


This variable can accept any GameObject or Prefab

Now drag the "Brick" Prefab from the Project View onto the brick variable in the Inspector. Press Play and you'll see the wall built using the Prefab.

This is a workflow pattern that can be used over and over again in Unity. In the beginning you might wonder why this is so much better, since the script that creates the cubes from code is only two lines longer.

But because you are using a Prefab now, you can adjust the Prefab in seconds. Want to change the mass of all those instances? Adjust the Rigidbody in the Prefab only once. Want to use a different Material for all the instances? Drag the Material onto the Prefab only once. Want to change friction? Use a different Physic Material in the Prefab's collider. Want to add a Particle System to all those boxes? Add a child to the Prefab only once.

Instantiating rockets & explosions

Here's how Prefabs fit into this scenario:

  1. A rocket launcher instantiates a rocket Prefab when the user presses fire. The Prefab contains a mesh, Rigidbody, Collider, and a child GameObject that contains a trail particle system.
  2. The rocket impacts and instantiates an explosion Prefab. The explosion Prefab contains a Particle System, a light that fades out over time, and a script that applies damage to surrounding GameObjects.

While it would be possible to build a rocket GameObject completely from code, adding Components manually and setting properties, it is far easier to instantiate a Prefab. You can instantiate the rocket in just one line of code, no matter how complex the rocket's Prefab is. After instantiating the Prefab you can also modify any properties of the instantiated object (e.g. you can set the velocity of the rocket's Rigidbody).

Aside from being easier to use, you can update the prefab later on. So if you are building a rocket, you don't immediately have to add a Particle trail to it. You can do that later. As soon as you add the trail as a child GameObject to the Prefab, all your instantiated rockets will have particle trails. And lastly, you can quickly tweak the properties of the rocket Prefab in the Inspector, making it far easier to fine-tune your game.

This script shows how to launch a rocket using the Instantiate() function.

// JavaScript

// Require the rocket to be a rigidbody.
// This way the user cannot assign a prefab without a Rigidbody
var rocket : Rigidbody;
var speed = 10.0;

function FireRocket () {
    var rocketClone : Rigidbody = Instantiate(rocket, transform.position, transform.rotation);
    rocketClone.velocity = transform.forward * speed;
    // You can also access other components / scripts of the clone
    rocketClone.GetComponent(MyRocketScript).DoSomething();
}

// Calls FireRocket() when the Fire1 button (by default Ctrl or the left mouse button) is pressed
function Update () {
    if (Input.GetButtonDown("Fire1")) {
        FireRocket();
    }
}


// C#

// Require the rocket to be a rigidbody.
// This way the user cannot assign a prefab without a Rigidbody
public Rigidbody rocket;
public float speed = 10f;

void FireRocket () {
	Rigidbody rocketClone = (Rigidbody) Instantiate(rocket, transform.position, transform.rotation);
	rocketClone.velocity = transform.forward * speed;

	// You can also access other components / scripts of the clone
	rocketClone.GetComponent<MyRocketScript>().DoSomething();
}

// Calls FireRocket() when the Fire1 button (by default Ctrl or the left mouse button) is pressed
void Update () {
	if (Input.GetButtonDown("Fire1")) {
		FireRocket();
	}
}

Replacing a character with a ragdoll or wreck

Let's say you have a fully rigged enemy character and he dies. You could simply play a death animation on the character and disable all scripts that usually handle the enemy logic. You probably have to take care of removing several scripts, adding some custom logic to make sure that no one will continue attacking the dead enemy anymore, and other cleanup tasks.

A far better approach is to immediately delete the entire character and replace it with an instantiated wrecked prefab. This gives you a lot of flexibility. You could use a different material for the dead character, attach completely different scripts, spawn a Prefab containing the object broken into many pieces to simulate a shattered enemy, or simply instantiate a Prefab containing a ragdoll version of the character.

Any of these options can be achieved with a single call to Instantiate(); you just have to hook it up to the right Prefab and you're set!

The important part to remember is that the wreck which you Instantiate() can be made of completely different objects than the original. For example, if you have an airplane, you would model two versions: one where the plane consists of a single GameObject with a Mesh Renderer and scripts for airplane physics. By keeping the model in just one GameObject, your game will run faster, since you can make the model with fewer triangles, and since it consists of fewer objects it will render faster than one built from many small parts. Besides, while your plane is happily flying around there is no reason to have it in separate parts.

To build a wrecked airplane Prefab, the typical steps are:

  1. Model your airplane with lots of different parts in your favorite modeler
  2. Create an empty Scene
  3. Drag the model into the empty Scene
  4. Add Rigidbodies to all parts, by selecting all the parts and choosing Component->Physics->Rigidbody
  5. Add Box Colliders to all parts by selecting all the parts and choosing Component->Physics->Box Collider
  6. For an extra special effect, add a smoke-like Particle System as a child GameObject to each of the parts
  7. Now you have an airplane with multiple exploded parts. The parts fall to the ground under physics and create particle trails thanks to their attached Particle Systems. Hit Play to preview how your model reacts and make any necessary tweaks.
  8. Choose Assets->Create->Prefab
  9. Drag the root GameObject containing all the airplane parts into the Prefab

The following script destroys the intact GameObject and replaces it with the wrecked Prefab:
// JavaScript

var wreck : GameObject;

// As an example, we turn the game object into a wreck after 3 seconds automatically
function Start () {
    yield WaitForSeconds(3);
    KillSelf();
}

// Replaces this GameObject with the wreck Prefab
function KillSelf () {
    // Instantiate the wreck game object at the same position we are at
    var wreckClone = Instantiate(wreck, transform.position, transform.rotation);

    // Sometimes we need to carry over some variables from this object
    // to the wreck
    wreckClone.GetComponent(MyScript).someVariable = GetComponent(MyScript).someVariable;

    // Kill ourselves
    Destroy(gameObject);
}


// C#

public GameObject wreck;

// As an example, we turn the game object into a wreck after 3 seconds automatically
IEnumerator Start() {
	yield return new WaitForSeconds(3);
	KillSelf();
}

// Replaces this GameObject with the wreck Prefab
void KillSelf () {
	// Instantiate the wreck game object at the same position we are at
	GameObject wreckClone = (GameObject) Instantiate(wreck, transform.position, transform.rotation);

	// Sometimes we need to carry over some variables from this object
	// to the wreck
	wreckClone.GetComponent<MyScript>().someVariable = GetComponent<MyScript>().someVariable;

	// Kill ourselves
	Destroy(gameObject);
}


The First Person Shooter tutorial explains how to replace a character with a ragdoll version and also synchronize limbs with the last state of the animation. You can find that tutorial on the Tutorials page.

Placing a bunch of objects in a specific pattern

Let's say you want to place a bunch of objects in a grid or circle pattern. Traditionally this would be done by either:

  1. Building an object completely from code. This is tedious! Entering values from a script is slow, unintuitive, and not worth the hassle.
  2. Making the fully rigged object, duplicating it, and placing it multiple times in the scene. This is tedious, and placing objects accurately in a grid is hard.

So use Instantiate() with a Prefab instead! We think you get the idea of why Prefabs are so useful in these scenarios. Here's the code necessary for these scenarios:

// JavaScript

// Instantiates a prefab in a circle

var prefab : GameObject;
var numberOfObjects = 20;
var radius = 5;

function Start () {
    for (var i = 0; i < numberOfObjects; i++) {
        var angle = i * Mathf.PI * 2 / numberOfObjects;
        var pos = Vector3 (Mathf.Cos(angle), 0, Mathf.Sin(angle)) * radius;
        Instantiate(prefab, pos, Quaternion.identity);
    }
}


// C#
// Instantiates a prefab in a circle

public GameObject prefab;
public int numberOfObjects = 20;
public float radius = 5f;

void Start() {
	for (int i = 0; i < numberOfObjects; i++) {
		float angle = i * Mathf.PI * 2 / numberOfObjects;
		Vector3 pos = new Vector3(Mathf.Cos(angle), 0, Mathf.Sin(angle)) * radius;
		Instantiate(prefab, pos, Quaternion.identity);
	}
}


// JavaScript

// Instantiates a prefab in a grid

var prefab : GameObject;
var gridX = 5;
var gridY = 5;
var spacing = 2.0;

function Start () {
    for (var y = 0; y < gridY; y++) {
        for (var x = 0; x < gridX; x++) {
            var pos = Vector3 (x, 0, y) * spacing;
            Instantiate(prefab, pos, Quaternion.identity);
        }
    }
}


// C#

// Instantiates a prefab in a grid

public GameObject prefab;
public int gridX = 5;
public int gridY = 5;
public float spacing = 2f;

void Start() {
	for (int y = 0; y < gridY; y++) {
		for (int x = 0; x < gridX; x++) {
			Vector3 pos = new Vector3(x, 0, y) * spacing;
			Instantiate(prefab, pos, Quaternion.identity);
		}
	}
} 

Page last updated: 2012-10-09



Input

Desktop

Note: Keyboard, joystick and gamepad input work on the desktop versions of Unity (including webplayer and Flash) but not on mobiles.

Unity supports keyboard, joystick and gamepad input.

Virtual axes and buttons can be created in the Input Manager, and end users can configure Keyboard input in a nice screen configuration dialog.

You can set up joysticks, gamepads, the keyboard, and the mouse, then access them all through one simple scripting interface.

From scripts, all virtual axes are accessed by their name.

Every project has the following default input axes when it's created:

  • Horizontal and Vertical are mapped to w, a, s, d and the arrow keys.
  • Fire1, Fire2, Fire3 are mapped to Control, Option (Alt), and Command, respectively.
  • Mouse X and Mouse Y are mapped to the delta of mouse movement.
  • Window Shake X and Window Shake Y are mapped to the movement of the window.

Adding new Input Axes

If you want to add new virtual axes go to the Edit->Project Settings->Input menu. Here you can also change the settings of each axis.

You map each axis to two buttons, which can be on a joystick, the mouse, or the keyboard.

Name: The string used to check this axis from a script.
Descriptive Name: Positive value name displayed in the Input tab of the Configuration dialog for standalone builds.
Descriptive Negative Name: Negative value name displayed in the Input tab of the Configuration dialog for standalone builds.
Negative Button: The button used to push the axis in the negative direction.
Positive Button: The button used to push the axis in the positive direction.
Alt Negative Button: Alternative button used to push the axis in the negative direction.
Alt Positive Button: Alternative button used to push the axis in the positive direction.
Gravity: Speed in units per second that the axis falls toward neutral when no buttons are pressed.
Dead: Size of the analog dead zone. All analog device values within this range map to neutral.
Sensitivity: Speed in units per second that the axis will move toward the target value. This is for digital devices only.
Snap: If enabled, the axis value will reset to zero when pressing a button of the opposite direction.
Invert: If enabled, the Negative Buttons provide a positive value, and vice versa.
Type: The type of inputs that will control this axis.
Axis: The axis of a connected device that will control this axis.
Joy Num: The connected joystick that will control this axis.

Use these settings to fine tune the look and feel of input. They are all documented with tooltips in the Editor as well.
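
To build intuition for how Gravity, Sensitivity, and Snap shape a digital axis value over time, here is a simplified model in plain JavaScript (illustrative only; this is not Unity's actual implementation, and the names are invented for the sketch):

```javascript
// Simplified model of a digital (keyboard-driven) virtual axis.
// target is +1 (positive button held), -1 (negative button held) or 0 (neither).
function makeAxis(sensitivity, gravity, snap) {
    var value = 0;
    function moveToward(current, target, maxDelta) {
        if (Math.abs(target - current) <= maxDelta) return target;
        return current + Math.sign(target - current) * maxDelta;
    }
    return {
        update: function (target, deltaTime) {
            if (target !== 0) {
                // Snap: jump straight through zero when the direction flips
                if (snap && Math.sign(target) !== Math.sign(value)) value = 0;
                // Sensitivity: ramp toward the pressed direction
                value = moveToward(value, target, sensitivity * deltaTime);
            } else {
                // Gravity: fall back toward neutral when nothing is pressed
                value = moveToward(value, 0, gravity * deltaTime);
            }
            return value;
        },
        get: function () { return value; }
    };
}
```

With a Sensitivity of 3, holding the positive button for a tenth of a second moves the value from 0 to roughly 0.3; releasing it lets Gravity pull it back to neutral.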

Using Input Axes from Scripts

You can query the current state from a script like this:

value = Input.GetAxis ("Horizontal");

An axis has a value between -1 and 1. The neutral position is 0. This is the case for joystick input and keyboard input.

However, Mouse Delta and Window Shake Delta are how much the mouse or window moved during the last frame. This means it can be larger than 1 or smaller than -1 when the user moves the mouse quickly.

It is possible to create multiple axes with the same name. When getting the input axis, the axis with the largest absolute value will be returned. This makes it possible to assign more than one input device to one axis name. For example, create one axis for keyboard input and one axis for joystick input with the same name. If the user is using the joystick, input will come from the joystick, otherwise input will come from the keyboard. This way you don't have to consider where the input comes from when writing scripts.
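
The "largest absolute value wins" rule can be sketched in plain JavaScript (illustrative only; Unity applies this internally when you call Input.GetAxis):

```javascript
// When several axes share one name, the value with the largest
// absolute value is returned.
function resolveAxis(values) {
    var best = 0;
    for (var i = 0; i < values.length; i++) {
        if (Math.abs(values[i]) > Math.abs(best)) best = values[i];
    }
    return best;
}
```

So if the keyboard axis reports 0 while the joystick axis reports -0.6, the merged value is -0.6; a full keyboard press of 1 wins over a slight joystick drift of -0.3.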

Button Names

To map a key to an axis, you have to enter the key's name in the Positive Button or Negative Button property in the Inspector.

The names of keys follow this convention:

  • Normal keys: "a", "b", "c" ...
  • Number keys: "1", "2", "3", ...
  • Arrow keys: "up", "down", "left", "right"
  • Keypad keys: "[1]", "[2]", "[3]", "[+]", "[equals]"
  • Modifier keys: "right shift", "left shift", "right ctrl", "left ctrl", "right alt", "left alt", "right cmd", "left cmd"
  • Mouse Buttons: "mouse 0", "mouse 1", "mouse 2", ...
  • Joystick Buttons (from any joystick): "joystick button 0", "joystick button 1", "joystick button 2", ...
  • Joystick Buttons (from a specific joystick): "joystick 1 button 0", "joystick 1 button 1", "joystick 2 button 0", ...
  • Special keys: "backspace", "tab", "return", "escape", "space", "delete", "enter", "insert", "home", "end", "page up", "page down"
  • Function keys: "f1", "f2", "f3", ...

The names used to identify the keys are the same in the scripting interface and the Inspector.

value = Input.GetKey ("a");

Mobile Input

On iOS and Android, the Input class offers access to touchscreen, accelerometer and geographical/location input.

Access to the keyboard on mobile devices is provided via the on-screen keyboard.

Multi-Touch Screen

The iPhone and iPod Touch devices are capable of tracking up to five fingers touching the screen simultaneously. You can retrieve the status of each finger touching the screen during the last frame by accessing the Input.touches property array.

Android devices don't have a unified limit on how many fingers they track. Instead, it varies from device to device and can be anything from two-touch on older devices to five fingers on some newer devices.

Each finger touch is represented by an Input.Touch data structure:

fingerId: The unique index for the touch.
position: The screen position of the touch.
deltaPosition: The screen position change since the last frame.
deltaTime: Amount of time that has passed since the last state change.
tapCount: The iPhone/iPad screen is able to distinguish quick finger taps by the user. This counter lets you know how many times the user has tapped the screen without moving the finger sideways. Android devices do not count taps, so this field is always 1.
phase: Describes the state of the touch. It can help you determine whether the touch just began, whether the user moved the finger, or whether the finger was just lifted.

Phase can be one of the following:

Began: A finger just touched the screen.
Moved: A finger moved on the screen.
Stationary: A finger is touching the screen but hasn't moved since the last frame.
Ended: A finger was lifted from the screen. This is the final phase of a touch.
Canceled: The system cancelled tracking for the touch, for example when the user puts the device up to their face or when more than five touches happen simultaneously. This is the final phase of a touch.

Following is an example script which will shoot a ray whenever the user taps on the screen:

var particle : GameObject;
function Update () {
	for (var touch : Touch in Input.touches) {
		if (touch.phase == TouchPhase.Began) {
			// Construct a ray from the current touch coordinates
			var ray = Camera.main.ScreenPointToRay (touch.position);
			if (Physics.Raycast (ray)) {
				// Create a particle if hit
				Instantiate (particle, transform.position, transform.rotation);
			}
		}
	}
}

Mouse Simulation

On top of native touch support, Unity iOS/Android provides a mouse simulation. You can use mouse functionality from the standard Input class.

Device Orientation

Unity iOS/Android allows you to get a discrete description of the device's physical orientation in three-dimensional space. Detecting a change in orientation can be useful if you want to create game behaviors that depend on how the user is holding the device.

You can retrieve device orientation by accessing the Input.deviceOrientation property. Orientation can be one of the following:

Unknown: The orientation of the device cannot be determined, for example when the device is rotated diagonally.
Portrait: The device is in portrait mode, with the device held upright and the home button at the bottom.
PortraitUpsideDown: The device is in portrait mode but upside down, with the device held upright and the home button at the top.
LandscapeLeft: The device is in landscape mode, with the device held upright and the home button on the right side.
LandscapeRight: The device is in landscape mode, with the device held upright and the home button on the left side.
FaceUp: The device is held parallel to the ground with the screen facing upwards.
FaceDown: The device is held parallel to the ground with the screen facing downwards.

Accelerometer

As the mobile device moves, a built-in accelerometer reports linear acceleration changes along the three primary axes in three-dimensional space. Acceleration along each axis is reported directly by the hardware as G-force values. A value of 1.0 represents a load of about +1g along a given axis while a value of -1.0 represents -1g. If you hold the device upright (with the home button at the bottom) in front of you, the X axis is positive along the right, the Y axis is positive directly up, and the Z axis is positive pointing toward you.

You can retrieve the accelerometer value by accessing the Input.acceleration property.

The following is an example script which will move an object using the accelerometer:

var speed = 10.0;
function Update () {
	var dir : Vector3 = Vector3.zero;

	// we assume that the device is held parallel to the ground
	// and the Home button is in the right hand

	// remap the device acceleration axis to game coordinates:
	//  1) XY plane of the device is mapped onto XZ plane
	//  2) rotated 90 degrees around Y axis
	dir.x = -Input.acceleration.y;
	dir.z = Input.acceleration.x;

	// clamp acceleration vector to the unit sphere
	if (dir.sqrMagnitude > 1)
		dir.Normalize();

	// Make it move 10 meters per second instead of 10 meters per frame...
	dir *= Time.deltaTime;

	// Move object
	transform.Translate (dir * speed);
}

Low-Pass Filter

Accelerometer readings can be jerky and noisy. Applying low-pass filtering on the signal allows you to smooth it and get rid of high frequency noise.

The following script shows you how to apply low-pass filtering to accelerometer readings:

var AccelerometerUpdateInterval : float = 1.0 / 60.0;
var LowPassKernelWidthInSeconds : float = 1.0;

private var LowPassFilterFactor : float = AccelerometerUpdateInterval / LowPassKernelWidthInSeconds; // tweakable
private var lowPassValue : Vector3 = Vector3.zero;
function Start () {
	lowPassValue = Input.acceleration;
}

function LowPassFilterAccelerometer() : Vector3 {
	lowPassValue = Vector3.Lerp(lowPassValue, Input.acceleration, LowPassFilterFactor);
	return lowPassValue;
}

The greater the value of LowPassKernelWidthInSeconds, the slower the filtered value will converge towards the current input sample (and vice versa). Call LowPassFilterAccelerometer() wherever you would otherwise read Input.acceleration directly to get the smoothed value.
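
You can see the kernel-width trade-off in a scalar version of the same filter, written here in plain JavaScript (the lerp step plays the role of Vector3.Lerp; the numbers are illustrative):

```javascript
// Scalar version of the accelerometer low-pass filter above.
// factor = accelerometerUpdateInterval / lowPassKernelWidthInSeconds
function lowPassStep(previous, sample, factor) {
    // Linear interpolation toward the newest sample
    return previous + (sample - previous) * factor;
}

// Feed a constant input through the filter for n frames, starting from 0,
// to see how far the filtered value has converged toward the input.
function converge(sample, factor, frames) {
    var value = 0;
    for (var i = 0; i < frames; i++) value = lowPassStep(value, sample, factor);
    return value;
}
```

After one second (60 frames) of a constant reading of 1, a 0.5-second kernel reaches about 0.87 while a 2-second kernel only reaches about 0.39, which is exactly the trade-off between responsiveness and smoothing.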

I'd like as much precision as possible when reading the accelerometer. What should I do?

Reading the Input.acceleration variable is not the same as sampling the hardware. Put simply, Unity samples the hardware at a frequency of 60Hz and stores the result in the variable. In reality, things are a little more complicated: accelerometer sampling doesn't occur at consistent time intervals if the system is under significant CPU load. As a result, the system might report two samples during one frame, then one sample during the next frame.

You can access all the measurements performed by the accelerometer during the frame. The following code illustrates a simple average of all the accelerometer events collected within the last frame:

var period : float = 0.0;
var acc : Vector3 = Vector3.zero;
for (var evnt : iPhoneAccelerationEvent  in iPhoneInput.accelerationEvents) {
	acc += evnt.acceleration * evnt.deltaTime;
	period += evnt.deltaTime;
}
if (period > 0)
	acc *= 1.0/period;
return acc;

Further Reading

The Unity mobile input API is originally based on Apple's API. It may help to learn more about the native API to better understand Unity's Input API. The relevant Apple documentation can be found in your locally installed iPhone SDK Reference Documentation.

Note: The iPhone SDK Reference Documentation contains native Objective-C code. It is not necessary to understand these documents to use Unity on mobile devices, but they may be helpful to some!

iOS

Device geographical location

Device geographical location can be obtained via the iPhoneInput.lastLocation property. Before reading this property you should start location service updates using iPhoneSettings.StartLocationServiceUpdates() and check the service status via iPhoneSettings.locationServiceStatus. See the scripting reference for details.

Page last updated: 2012-06-28



Transforms

Transforms are a key Component in every GameObject. They dictate where the GameObject is positioned, how it is rotated, and its scale. It is impossible to have a GameObject without a Transform. You can adjust the Transform of any GameObject from the Scene View, the Inspector, or through Scripting.

The remainder of this page's text is from the Transform Component Reference page.

Transform

The Transform Component determines the actual Position, Rotation, and Scale of every object in the scene. Every object has a Transform.


The Transform Component is viewable and editable in the Inspector

Properties

Position: Position of the Transform in X, Y, and Z coordinates.
Rotation: Rotation of the Transform around the X, Y, and Z axes, measured in degrees.
Scale: Scale of the Transform along the X, Y, and Z axes. A value of 1 is the original size (the size at which the object was imported).

All properties of a Transform are measured relative to the Transform's parent (see below for more details). If the Transform has no parent, the properties are measured in world space.

Using Transforms

Transforms are manipulated in 3D space along the X, Y, and Z axes. In Unity, these axes are represented by the colors red, green, and blue respectively. Remember: XYZ = RGB.


The color-coding relationship between the three axes and the Transform properties

Transform Components can be manipulated directly in the Scene View or by editing their properties in the Inspector. In the scene, you can manipulate a Transform using the Move, Rotate, and Scale tools. These tools are located in the upper-left corner of the Unity Editor.


The View, Translate, Rotate, and Scale tools

These tools can be used on any object in the scene. When you click on an object, the tool gizmo appears on it. The appearance of the gizmo depends on which tool is currently selected.



All three gizmos can be edited directly in the Scene View.

To manipulate the Transform, click and drag on one of the three gizmo axes; you'll notice its color changes. As you drag the mouse, the object will translate, rotate, or scale along the selected axis. When you release the mouse button, the axis remains selected. You can then click the middle mouse button and drag to keep manipulating the Transform along that axis.


Any individual axis will become selected when you click on it

Parenting

Parenting is one of the most important concepts to understand when using Unity. When a GameObject is a Parent of another GameObject, the Child GameObject will move, rotate, and scale exactly as its Parent does. Just like your arms are attached to your body: when you turn your body, your arms move with it. Any object can have multiple children, but only one parent.

You can create a Parent by dragging any GameObject in the Hierarchy View onto another. This creates a Parent-Child relationship between the two GameObjects.


An example of a Parent-Child hierarchy. All GameObjects with an arrow to the left of their names are parents.

In the example above, the body is the parent of the arms, and the arms are parents of the hands. Any scene you make in Unity will contain a collection of these Transform hierarchies. The topmost parent object is called the Root object. When you move, scale, or rotate a parent, all of those changes to its Transform are applied to its children as well.

It is worth pointing out that the Transform values shown in the Inspector for any Child GameObject are displayed relative to the Parent's Transform values. These values are called Local Coordinates. Through scripting, you can access the Global Coordinates as well as the Local Coordinates.

By parenting several separate objects you can create compound objects, such as the skeletal structure of a human ragdoll. Even simple structures can produce useful effects. For example, say you are making a horror game set at night and want a flashlight. To create this object, you would parent a spotlight Transform to the flashlight Transform. Then, any alteration of the flashlight's Transform will affect the spotlight too, creating a convincing flashlight effect.

Performance Issues and Limitations with Non-Uniform Scaling

Non-uniform scaling is when the Scale of a Transform has different values for x, y, and z, for example (2, 4, 2). In contrast, uniform scaling has the same value for x, y, and z, for example (3, 3, 3). Non-uniform scaling can be useful in a few select cases but should be avoided whenever possible.

Non-uniform scaling has a negative impact on rendering performance. In order to transform vertex normals correctly, the mesh is transformed on the CPU and a copy of the data is created. Normally, meshes are shared between instances in graphics memory, but in this case both a CPU cost and a memory cost are incurred per instance.

There are also certain limitations in the way Unity handles non-uniform scaling.

Importance of Scale

The scale of the Transform determines the difference between the size of a mesh in your modeling application and the size of that mesh in Unity. The mesh's size in Unity (and therefore the Transform's scale) is very important, especially during physics simulation. The scale of an object is determined by three factors: the size of your mesh in your 3D modeling application, the Mesh Scale Factor setting in the object's Import Settings, and the Scale values of your Transform Component.

Ideally, you should not adjust the Scale of your object in the Transform Component. The best option is to create your models at real-life scale, so you won't have to change your Transform's scale. The next best option is to adjust the scale at which your mesh is imported, in the Import Settings for the individual mesh. Certain optimizations occur based on the import size, and instantiating an object that has an adjusted scale value can decrease performance. For more information, see the section about optimizing the scale of Rigidbodies on the Rigidbody component page.

Hints

Page last updated: 2007-11-16



Physics

Unity has the NVIDIA PhysX physics engine built in. This allows for unique emergent behaviour and has many useful features.

Basics

To put an object under physics control, simply add a Rigidbody to it. When you do this, the object will be affected by gravity, and can collide with other objects in the world.

Rigidbodies

Rigidbodies are physically simulated objects. You use Rigidbodies for things that the player can push around, such as crates or loose objects, or you can move a Rigidbody around directly by applying forces to it via scripting.

If you move the Transform of a non-Kinematic Rigidbody directly it may not collide correctly with other objects. Instead you should move a Rigidbody by applying forces and torque to it. You can also add Joints to rigidbodies to make the behavior more complex. For example, you could make a physical door or a crane with a swinging chain.

You also use Rigidbodies to bring vehicles to life, for example you can make cars using a Rigidbody, 4 Wheel Colliders and a script applying wheel forces based on the user's Input.

You can make airplanes by applying forces to the Rigidbody from a script. Or you can create special vehicles or robots by adding various Joints and applying forces via scripting.

Rigidbodies are most often used in combination with primitive colliders.

Tips:

Kinematic Rigidbodies

A Kinematic Rigidbody is a Rigidbody that has the isKinematic option enabled. Kinematic Rigidbodies are not affected by forces, gravity or collisions. They are driven explicitly by setting the position and rotation of the Transform or animating them, yet they can interact with other non-Kinematic Rigidbodies.

Kinematic Rigidbodies correctly wake up other Rigidbodies when they collide with them, and they apply friction to Rigidbodies placed on top of them.

These are a few example uses for Kinematic Rigidbodies:

  1. Sometimes you want an object to be under physics control but in another situation to be controlled explicitly from a script or animation. For example you could make an animated character whose bones have Rigidbodies attached that are connected with joints for use as a Ragdoll. Most of the time the character is under animation control, thus you make the Rigidbody Kinematic. But when he gets hit you want him to turn into a Ragdoll and be affected by physics. To accomplish this, you simply disable the isKinematic property.
  2. Sometimes you want a moving object that can push other objects yet not be pushed itself. For example if you have an animated platform and you want to place some Rigidbody boxes on top, you should make the platform a Kinematic Rigidbody instead of just a Collider without a Rigidbody.
  3. You might want to have a Kinematic Rigidbody that is animated and have a real Rigidbody follow it using one of the available Joints.
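The ragdoll case in point 1 can be sketched as follows (a hedged UnityScript example that only runs inside Unity; the OnHit entry point is hypothetical and would be called from your own damage-handling code):

```
// While isKinematic is enabled the bones follow the animation; disabling
// it hands the Rigidbody back to the physics engine, turning the
// character into a ragdoll.
function OnHit () {
	rigidbody.isKinematic = false;
}
```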

Static Colliders

A Static Collider is a GameObject that has a Collider but not a Rigidbody. Static Colliders are used for level geometry which always stays at the same place and never moves around. You can add a Mesh Collider to your already existing graphical meshes (even better, use the Import Settings' Generate Colliders check box), or you can use one of the other Collider types.

You should never move a Static Collider on a frame by frame basis. Moving Static Colliders will cause an internal recomputation in PhysX that is quite expensive and which will result in a big drop in performance. On top of that the behaviour of waking up other Rigidbodies based on a Static Collider is undefined, and moving Static Colliders will not apply friction to Rigidbodies that touch it. Instead, Colliders that move should always be Kinematic Rigidbodies.

Character Controllers

You use Character Controllers if you want to make a humanoid character. This could be the main character in a third-person platformer, a first-person shooter, or any enemy character.

These controllers don't follow the rules of physics, since that would not feel right (in Doom you run 90 miles per hour, come to a halt in one frame and turn on a dime). Instead, a Character Controller performs collision detection to make sure your characters can slide along walls, walk up and down stairs, and so on.

Character Controllers are not affected by forces but they can push Rigidbodies by applying forces to them from a script. Usually, all humanoid characters are implemented using Character Controllers.

Character Controllers are inherently unphysical, so if you want to apply real physics to your character - swinging on ropes, being pushed by big rocks - you have to use a Rigidbody; this will let you use joints and forces on your character. Character Controllers are always aligned along the Y axis, so you also need to use a Rigidbody if your character needs to be able to change orientation in space (for example under a changing gravity). However, be aware that tuning a Rigidbody to feel right for a character is hard due to the unphysical way in which game characters are expected to behave. Another difference is that Character Controllers can slide smoothly over steps of a specified height, while Rigidbodies will not.

If you parent a Character Controller with a Rigidbody you will get a "Joint" like behavior.

Rigidbody

Rigidbodies enable your GameObjects to act under the control of physics. The Rigidbody can receive forces and torque to make your objects move in a realistic way. Any GameObject must contain a Rigidbody to be influenced by gravity, act under added forces via scripting, or interact with other objects through the NVIDIA PhysX physics engine.


Rigidbodies allow GameObjects to act under physical influence

Properties

Mass - The mass of the object (in kilograms). It is recommended to make masses not more or less than 100 times that of other Rigidbodies.
Drag - How much air resistance affects the object when moving from forces. 0 means no air resistance, and infinity makes the object stop moving immediately.
Angular Drag - How much air resistance affects the object when rotating from torque. 0 means no air resistance, and infinity makes the object stop rotating immediately.
Use Gravity - If enabled, the object is affected by gravity.
Is Kinematic - If enabled, the object will not be driven by the physics engine and can only be manipulated by its Transform. This is useful for moving platforms or if you want to animate a Rigidbody that has a HingeJoint attached.
Interpolate - Try one of these options only if you see jerkiness in your Rigidbody's movement.
  None - No interpolation is applied.
  Interpolate - The Transform is smoothed based on the Transform of the previous frame.
  Extrapolate - The Transform is smoothed based on the estimated Transform of the next frame.
Freeze Rotation - If enabled, the GameObject will never rotate as a result of collisions or forces added via scripts; it will only rotate when using transform.Rotate().
Collision Detection - Used to prevent fast-moving objects from passing through other objects without detecting the collision.
  Discrete - Use discrete collision detection against all other Colliders in the scene. Other Colliders will use discrete collision detection when testing for collisions against it. Used for normal collisions (this is the default value).
  Continuous - Use discrete collision detection against dynamic Colliders (with a Rigidbody) and continuous collision detection against static MeshColliders (without a Rigidbody). Rigidbodies set to Continuous Dynamic will use continuous collision detection when testing for collisions against this Rigidbody; other Rigidbodies will use discrete collision detection. Used for objects that the Continuous Dynamic detection needs to collide with. (This has a big impact on physics performance, so leave it set to Discrete if you don't have issues with collisions of fast objects.)
  Continuous Dynamic - Use continuous collision detection against objects set to Continuous and Continuous Dynamic collision detection. It will also use continuous collision detection against static MeshColliders (without a Rigidbody). For all other Colliders it uses discrete collision detection. Used for fast-moving objects.
Constraints - Restrictions on the Rigidbody's motion:
  Freeze Position - Selectively stops the Rigidbody from moving along the world X, Y and Z axes.
  Freeze Rotation - Selectively stops the Rigidbody from rotating around the world X, Y and Z axes.

Details

Rigidbodies allow your GameObjects to act under control of the physics engine. This opens the gateway to realistic collisions, varied types of joints, and other very cool behaviors. Manipulating your GameObjects by adding forces to a Rigidbody creates a very different feel and look than adjusting the Transform component directly. Generally, you shouldn't manipulate the Rigidbody and the Transform of the same GameObject - only one or the other.

The biggest difference between manipulating the Transform versus the Rigidbody is the use of forces. Rigidbodies can receive forces and torque, but Transforms cannot. Transforms can be translated and rotated, but this is not the same as using physics. You'll notice the distinct difference when you try it for yourself. Adding forces/torque to the Rigidbody will actually change the object's position and the rotation of its Transform component. This is why you should only use one or the other. Changing the Transform while using physics could cause problems with collisions and other calculations.

Rigidbodies must be explicitly added to your GameObject before they will be affected by the physics engine. You can add a Rigidbody to your selected object from Components->Physics->Rigidbody in the menu bar. Now your object is physics-ready; it will fall under gravity and can receive forces via scripting, but you may need to add a Collider or a Joint to get it to behave exactly how you want.

Parenting

When an object is under physics control, it moves semi-independently of the way its transform parents move. If you move any parents, they will pull the Rigidbody child along with them. However, the Rigidbodies will still fall down due to gravity and react to collision events.

Scripting

To control your Rigidbodies, you will primarily use scripts to add forces or torque. You do this by calling AddForce() and AddTorque() on the object's Rigidbody. Remember that you shouldn't be directly altering the object's Transform when you are using physics.

Animation

For some situations, mainly creating ragdoll effects, it is necessary to switch control of the object between animations and physics. For this purpose Rigidbodies can be marked isKinematic. While the Rigidbody is marked isKinematic, it will not be affected by collisions, forces, or any other part of PhysX. This means that you will have to control the object by manipulating the Transform component directly. Kinematic Rigidbodies will affect other objects, but they themselves will not be affected by physics. For example, Joints which are attached to Kinematic objects will constrain any other Rigidbodies attached to them, and Kinematic Rigidbodies will affect other Rigidbodies through collisions.

Colliders

Colliders are another kind of component that must be added alongside the Rigidbody in order to allow collisions to occur. If two Rigidbodies bump into each other, the physics engine will not calculate a collision unless both objects also have a Collider attached. Collider-less Rigidbodies will simply pass through each other during physics simulation.


Colliders define the physical boundaries of a Rigidbody

Add a Collider with the Component->Physics menu. View the Component Reference page of any individual Collider for more specific information.

Compound Colliders

Compound Colliders are combinations of primitive Colliders, collectively acting as a single Collider. They come in handy when you have a complex mesh to use in collisions but cannot use a Mesh Collider. To create a Compound Collider, create child objects of your colliding object, then add a primitive Collider to each child object. This allows you to position, rotate, and scale each Collider easily and independently of one another.


A real-world Compound Collider setup

In the above picture, the gun model GameObject has a Rigidbody attached, and multiple primitive Colliders as child GameObjects. When the Rigidbody parent is moved around by forces, the child Colliders move along with it. The primitive Colliders will collide with the environment's Mesh Collider, and the parent Rigidbody will alter the way it moves based on forces being applied to it and how its child Colliders interact with other Colliders in the scene.

Continuous Collision Detection

Continuous collision detection is a feature to prevent fast-moving colliders from passing each other. This may happen when using normal (Discrete) collision detection, when an object is on one side of a collider in one frame, and has already passed the collider in the next frame. To solve this, you can enable continuous collision detection on the Rigidbody of the fast-moving object. Set the collision detection mode to Continuous to prevent the Rigidbody from passing through any static (i.e. non-Rigidbody) MeshColliders. Set it to Continuous Dynamic to also prevent the Rigidbody from passing through any other supported Rigidbodies with collision detection mode set to Continuous or Continuous Dynamic. Continuous collision detection is supported for Box-, Sphere- and CapsuleColliders.

Use the right size

The size of your GameObject's mesh is much more important than the mass of the Rigidbody. If you find that your Rigidbody is not behaving exactly how you expect - it moves slowly, floats, or doesn't collide correctly - consider adjusting the scale of your mesh asset. Unity's default unit scale is 1 unit = 1 meter, so the scale of your imported mesh is maintained and applied to physics calculations. For example, a crumbling skyscraper is going to fall apart very differently than a tower made of toy blocks, so objects of different sizes should be modeled to accurate scale.

If you are modeling a human, make sure the model is around 2 meters tall in Unity. To check if your object has the right size, compare it to the default cube. You can create a new cube using GameObject->Create Other->Cube. The cube's height will be exactly 1 meter, so your human should be twice as tall.

If you aren't able to adjust the mesh itself, you can change the uniform scale of a particular mesh asset by selecting it in the Project View and choosing Assets->Import Settings... from the menu bar. Here, you can change the scale and re-import your mesh.

If your game requires that your GameObject be instantiated at different scales, it is okay to adjust the values of your Transform's scale axes. The downside is that the physics simulation must do more work at the time the object is instantiated, which could cause a performance drop in your game. This isn't a large loss, but it is not as efficient as finalizing your scale with the other two options. Also keep in mind that non-uniform scales can create undesirable behaviors when parenting is used. For these reasons it is always optimal to create your object at the correct scale in your modeling application.


Constant Force

Constant Force is a quick utility for adding constant forces to a Rigidbody. This works great for one-shot objects like rockets, if you don't want them to start with a large velocity but instead accelerate.


A rocket propelled forward by a Constant Force

Properties

Force - The vector of a force to be applied in world space.
Relative Force - The vector of a force to be applied in the object's local space.
Torque - The vector of a torque, applied in world space. The object will begin spinning around this vector. The longer the vector is, the faster the rotation.
Relative Torque - The vector of a torque, applied in local space. The object will begin spinning around this vector. The longer the vector is, the faster the rotation.

Details

To make a rocket that accelerates forward, set the Relative Force to be along the positive Z axis. Then use the Rigidbody's Drag property to make it not exceed some maximum velocity (the higher the drag, the lower the maximum velocity will be). In the Rigidbody, also turn off gravity so that the rocket will always stay on its path.


Sphere Collider

The Sphere Collider is a basic sphere-shaped collision primitive.


A pile of Sphere Colliders

Properties

Material - Reference to the Physic Material that determines how this Collider interacts with others.
Is Trigger - If enabled, this Collider is used for triggering events, and is ignored by the physics engine.
Radius - The size of the Collider.
Center - The position of the Collider in the object's local space.

Details

The Sphere Collider can be resized to uniform scale, but not along individual axes. It works great for falling boulders, ping pong balls, marbles, etc.

A standard Sphere Collider

Colliders work with Rigidbodies to bring physics in Unity to life. Whereas Rigidbodies allow objects to be controlled by physics, Colliders allow objects to collide with each other. Colliders must be added to objects independently of Rigidbodies. A Collider does not necessarily need a Rigidbody attached, but a Rigidbody must be attached in order for the object to move as a result of collisions.

When a collision between two Colliders occurs, and if at least one of them has a Rigidbody attached, three collision messages are sent out to the objects attached to them. These events can be handled in scripting, and allow you to create unique behaviors with or without making use of the built-in NVIDIA PhysX engine.
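For example, these messages can be handled in a script attached to either object (a minimal UnityScript sketch that only runs inside Unity; the log text is illustrative):

```
function OnCollisionEnter (collision : Collision) {
	Debug.Log ("Started touching " + collision.gameObject.name);
}

function OnTriggerEnter (other : Collider) {
	Debug.Log (other.name + " entered the trigger");
}
```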

Triggers

An alternative way of using a Collider is to mark it as a Trigger, by checking the IsTrigger property checkbox in the Inspector. Triggers are effectively ignored by the physics engine, and have a unique set of three trigger messages that are sent out when a collision with a Trigger occurs. Triggers are useful for triggering other events in your game, like cutscenes, automatic door opening, displaying tutorial messages, and so on. Use your imagination!

Be aware that in order for two Triggers to send out trigger events when they collide, one of them must include a Rigidbody as well. For a Trigger to collide with a normal Collider, one of them must have a Rigidbody attached. For a detailed chart of different types of collisions, see the collision action matrix in the Advanced section.

Friction and bounciness

Friction, bounciness and softness are defined in the Physic Material. The Standard Assets contain the most common physics materials. To use one of them, click on the Physic Material drop-down and select one, e.g. Ice. You can also create your own physics materials and tweak all friction values.

Compound Colliders

Compound Colliders are combinations of primitive Colliders, collectively acting as a single Collider. They come in handy when you have a complex mesh to use in collisions but cannot use a Mesh Collider. To create a Compound Collider, create child objects of your colliding object, then add a primitive Collider to each child object. This allows you to position, rotate, and scale each Collider easily and independently of one another.


A real-world Compound Collider setup

In the above picture, the gun model GameObject has a Rigidbody attached, and multiple primitive Colliders as child GameObjects. When the Rigidbody parent is moved around by forces, the child Colliders move along with it. The primitive Colliders will collide with the environment's Mesh Collider, and the parent Rigidbody will alter the way it moves based on forces being applied to it and how its child Colliders interact with other Colliders in the scene.

Mesh Colliders can't normally collide with each other. If a Mesh Collider is marked as Convex, then it can collide with another Mesh Collider. The typical solution is to use primitive Colliders for any objects that move, and Mesh Colliders for static background objects.


Advanced

Collider combinations

There are numerous different combinations of collisions that can happen in Unity. Each game is unique, and different combinations may work better for different types of games. If you're using physics in your game, it will be very helpful to understand the basic Collider types, their common uses, and how they interact with other types of objects.

Static Collider

These are GameObjects that do not have a Rigidbody attached, but do have a Collider attached. These objects should remain still, or move only very little. They work great for your environment geometry. They will not move if a Rigidbody collides with them.

Rigidbody Collider

These GameObjects contain both a Rigidbody and a Collider. They are affected by the physics engine, and change their trajectory in response to applied forces and collisions. They can collide with GameObjects that only contain a Collider. These will likely be your primary type of Collider in games that use physics.

Kinematic Rigidbody Collider

These GameObjects contain a Collider and a Rigidbody which is marked IsKinematic. To move these GameObjects, you modify their Transform component, rather than applying forces. They're similar to Static Colliders but will work better when you want to move the Collider around frequently. There are some other specialized scenarios for using these GameObjects.

These objects are useful when you want a Static Collider to send trigger events. Since a Trigger must have a Rigidbody attached, you should add a Rigidbody and then enable IsKinematic. This will prevent your object from moving under physics influence, and allow you to receive trigger events when you want to.

Kinematic Rigidbodies can easily be turned on and off. This is great for creating ragdolls, when you normally want a character to follow an animation, then turn into a ragdoll when a collision occurs, prompted by an explosion or another effect of your choosing.

If you don't move a Rigidbody for some time, it will be put completely to sleep. In other words, its values will not be updated during the physics update and it will stay where it is. If you move a Kinematic Rigidbody Collider out from underneath normal Rigidbody Colliders that are at rest on top of it, the sleeping Rigidbodies will wake up and the physics update will start again. So if you have a lot of Static Colliders that you want to move around and have different objects fall on them correctly, use Kinematic Rigidbody Colliders.

Collision action matrix

Depending on the configurations of the two colliding objects, a number of different actions can occur. The chart below outlines what you can expect from two colliding objects, based on the components that are attached to them. Some of the combinations only cause one of the two objects to be affected by the collision, so keep the standard rule in mind - physics will not be applied to objects that do not have Rigidbodies attached.

The columns use the same six setups as the rows:
(1) Static Collider, (2) Rigidbody Collider, (3) Kinematic Rigidbody Collider,
(4) Static Trigger Collider, (5) Rigidbody Trigger Collider, (6) Kinematic Rigidbody Trigger Collider

Collision detection occurs and collision messages are sent upon collision (Y = yes, - = no):

                                          1  2  3  4  5  6
  Static Collider                         -  Y  -  -  -  -
  Rigidbody Collider                      Y  Y  Y  -  -  -
  Kinematic Rigidbody Collider            -  Y  -  -  -  -
  Static Trigger Collider                 -  -  -  -  -  -
  Rigidbody Trigger Collider              -  -  -  -  -  -
  Kinematic Rigidbody Trigger Collider    -  -  -  -  -  -

Trigger messages are sent upon collision (Y = yes, - = no):

                                          1  2  3  4  5  6
  Static Collider                         -  -  -  -  Y  Y
  Rigidbody Collider                      -  -  -  Y  Y  Y
  Kinematic Rigidbody Collider            -  -  -  Y  Y  Y
  Static Trigger Collider                 -  Y  Y  -  Y  Y
  Rigidbody Trigger Collider              Y  Y  Y  Y  Y  Y
  Kinematic Rigidbody Trigger Collider    Y  Y  Y  Y  Y  Y

Layer-Based Collision Detection

Unity 3.x introduced Layer-Based Collision Detection, which lets you configure, for every combination of layers, whether objects on those layers can collide with each other. For more information, see the Layer-Based Collision Detection page.
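The same mechanism can also be driven from code (a sketch that only runs inside Unity; the choice of layers 8 and 9 is an assumption for illustration):

```
// Stop objects on layer 8 from ever colliding with objects on layer 9.
Physics.IgnoreLayerCollision (8, 9, true);
```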

Box Collider

The Box Collider is a basic cube-shaped collision primitive.


A pile of Box Colliders

Properties

Material - Reference to the Physic Material that determines how this Collider interacts with others.
Is Trigger - If enabled, this Collider is used for triggering events, and is ignored by the physics engine.
Size - The size of the Collider in the X, Y and Z directions.
Center - The position of the Collider in the object's local space.

Details

The Box Collider can be resized into different shapes of rectangular prisms. It works great for doors, walls, platforms, etc. It is also effective as a human torso in a ragdoll or as a car hull in a vehicle. Of course, it works perfectly for just boxes and crates as well!


A standard Box Collider

Colliders work with Rigidbodies to bring physics in Unity to life. Whereas Rigidbodies allow objects to be controlled by physics, Colliders allow objects to collide with each other. Colliders must be added to objects independently of Rigidbodies. A Collider does not necessarily need a Rigidbody attached, but a Rigidbody must be attached in order for the object to move as a result of collisions.

When a collision between two Colliders occurs, and if at least one of them has a Rigidbody attached, three collision messages are sent out to the objects attached to them. These events can be handled in scripting, and allow you to create unique behaviors with or without making use of the built-in NVIDIA PhysX engine.

Triggers

An alternative way of using a Collider is to mark it as a Trigger, by checking the IsTrigger property checkbox in the Inspector. Triggers are effectively ignored by the physics engine, and have a unique set of three trigger messages that are sent out when a collision with a Trigger occurs. Triggers are useful for triggering other events in your game, like cutscenes, automatic door opening, displaying tutorial messages, and so on. Use your imagination!

Be aware that in order for two Triggers to send out trigger events when they collide, one of them must include a Rigidbody as well. For a Trigger to collide with a normal Collider, one of them must have a Rigidbody attached. For a detailed chart of different types of collisions, see the collision action matrix in the Advanced section.

Friction and bounciness

Friction, bounciness and softness are defined in the Physic Material. The Standard Assets contain the most common physics materials. To use one of them, click on the Physic Material drop-down and select one, e.g. Ice. You can also create your own physics materials and tweak all friction values.

Mesh Collider

The Mesh Collider takes a Mesh Asset and builds its Collider based on that mesh. It is far more accurate for collision detection than using primitives for complicated meshes. Mesh Colliders that are marked as Convex can collide with other Mesh Colliders.


A Mesh Collider used on level geometry

Properties

Material - Reference to the Physic Material that determines how this Collider interacts with others.
Is Trigger - If enabled, this Collider is used for triggering events, and is ignored by the physics engine.
Mesh - Reference to the Mesh to use for collisions.
Smooth Sphere Collisions - When this is enabled, collision mesh normals are smoothed. You should enable this on smooth surfaces, e.g. rolling terrain without hard edges, to make sphere rolling smoother.
Convex - If enabled, this Mesh Collider will collide with other Mesh Colliders. Convex Mesh Colliders are limited to 255 triangles.

Details

The Mesh Collider builds its collision representation from the Mesh attached to the GameObject, and reads the properties of the attached Transform to set its position and scale correctly.

Collision meshes use backface culling. If an object collides with a mesh that will be backface culled graphically, it will also not collide with it physically.

There are some limitations when using the Mesh Collider. Usually, two Mesh Colliders cannot collide with each other. All Mesh Colliders can collide with any primitive Collider. If your mesh is marked as Convex, then it can collide with other Mesh Colliders.

Colliders work with Rigidbodies to bring physics in Unity to life. Whereas Rigidbodies allow objects to be controlled by physics, Colliders allow objects to collide with each other. Colliders must be added to objects independently of Rigidbodies. A Collider does not necessarily need a Rigidbody attached, but a Rigidbody must be attached in order for the object to move as a result of collisions.

When a collision between two Colliders occurs, and if at least one of them has a Rigidbody attached, three collision messages are sent out to the objects attached to them. These events can be handled in scripting, and allow you to create unique behaviors with or without making use of the built-in NVIDIA PhysX engine.

Triggers

An alternative way of using a Collider is to mark it as a Trigger, by checking the IsTrigger property checkbox in the Inspector. Triggers are effectively ignored by the physics engine, and have a unique set of three trigger messages that are sent out when a collision with a Trigger occurs. Triggers are useful for triggering other events in your game, like cutscenes, automatic door opening, displaying tutorial messages, and so on. Use your imagination!

Be aware that in order for two Triggers to send out trigger events when they collide, one of them must include a Rigidbody as well. For a Trigger to collide with a normal Collider, one of them must have a Rigidbody attached. For a detailed chart of different types of collisions, see the collision action matrix in the Advanced section.

Friction and bounciness

Friction, bounciness and softness are defined in the Physic Material. The Standard Assets contain the most common physics materials. To use one of them, click on the Physic Material drop-down and select one, e.g. Ice. You can also create your own physics materials and tweak all friction values.


Physic Material

The Physic Material is used to adjust friction and bouncing effects of colliding objects.

To create a Physic Material, select Assets->Create->Physic Material from the menu bar. Then drag the Physic Material from the Project View onto a Collider in the scene.


The Physic Material Inspector

Properties

Dynamic Friction - The friction used when already moving. Usually a value from 0 to 1. A value of zero feels like ice; a value of 1 will make the object come to rest very quickly unless a lot of force or gravity pushes it.
Static Friction - The friction used when an object is lying still on a surface. Usually a value from 0 to 1. A value of zero feels like ice; a value of 1 will make it very hard to get the object moving.
Bounciness - How bouncy the surface is. A value of 0 will not bounce. A value of 1 will bounce without any loss of energy.
Friction Combine Mode - How the friction of two colliding objects is combined.
  Average - The two friction values are averaged.
  Min - The smallest of the two values is used.
  Max - The largest of the two values is used.
  Multiply - The friction values are multiplied with each other.
Bounce Combine - How the bounciness of two colliding objects is combined. It has the same modes as Friction Combine Mode.
Friction Direction 2 - The direction of anisotropy. Anisotropic friction is enabled if this direction is not zero. Dynamic Friction 2 and Static Friction 2 will be applied along Friction Direction 2.
Dynamic Friction 2 - If anisotropic friction is enabled, DynamicFriction2 will be applied along Friction Direction 2.
Static Friction 2 - If anisotropic friction is enabled, StaticFriction2 will be applied along Friction Direction 2.

Details

Friction is the quantity which prevents surfaces from sliding off each other. This value is critical when trying to stack objects. Friction comes in two forms, dynamic and static. Static friction is used when the object is lying still; it will prevent the object from starting to move. If a large enough force is applied to the object, it will start moving. At this point Dynamic Friction takes over. Dynamic Friction will attempt to slow down the object while it is in contact with another.


Hinge Joint

The Hinge Joint groups together two Rigidbodies, constraining them to move like they are connected by a hinge. It is perfect for doors, but can also be used to model chains, pendulums, and so on.


The Hinge Joint Inspector

Properties

Connected Body - Optional reference to the Rigidbody that the joint is dependent upon. If not set, the joint connects to the world.
Anchor - The position of the axis around which the body swings. The position is defined in local space.
Axis - The direction of the axis around which the body swings. The direction is defined in local space.
Use Spring - A spring makes the Rigidbody reach for a specific angle compared to its connected body.
Spring - Properties of the spring that is used if Use Spring is enabled.
  Spring - The force the object exerts to move into the position.
  Damper - The higher this value, the more the object will slow down.
  Target Position - Target angle of the spring. The spring pulls towards this angle, measured in degrees.
Use Motor - The motor makes the object spin around.
Motor - Properties of the motor that is used if Use Motor is enabled.
  Target Velocity - The speed the object tries to attain.
  Force - The force applied in order to attain the speed.
  Free Spin - If enabled, the motor is never used to brake the spinning, only to accelerate it.
Use Limits - If enabled, the angle of the hinge is restricted to within the Min and Max values.
Limits - Properties of the limits that are used if Use Limits is enabled.
  Min - The lowest angle the rotation can reach.
  Max - The highest angle the rotation can reach.
  Min Bounce - How much the object bounces when it hits the minimum stop.
  Max Bounce - How much the object bounces when it hits the maximum stop.
Break Force - The force that needs to be applied for this joint to break.
Break Torque - The torque that needs to be applied for this joint to break.

Details

A single Hinge Joint should be applied to a GameObject. The hinge will rotate at the point specified by the Anchor property, moving around the specified Axis property. You do not need to assign a GameObject to the joint's Connected Body property. You should only assign a GameObject to the Connected Body property if you want the joint's Transform to be dependent on the attached object's Transform.

Think about how the hinge of a door works. The Axis in this case is up, positive along the Y axis. The Anchor is placed somewhere at the intersection between door and wall. You would not need to assign the wall to the Connected Body, because the joint will be connected to the world by default.

Now think about a doggy door hinge. The doggy door's Axis would be sideways, positive along the relative X axis. The main door should be assigned as the Connected Body, so the doggy door's hinge is dependent on the main door's Rigidbody.

Multiple Hinge Joints can also be strung together to create a chain. Add a joint to each link in the chain, and attach the next link as the Connected Body.


Spring Joint

The Spring Joint groups together two Rigidbodies, constraining them to move like they are connected by a spring.


The Spring Joint Inspector

Properties

Connected Body - Optional reference to the Rigidbody that the joint is dependent upon.
Anchor - Position in the object's local space (at rest) that defines the center of the joint. This is not the point that the object will be drawn toward.
  X - Position of the joint's local center along the X axis.
  Y - Position of the joint's local center along the Y axis.
  Z - Position of the joint's local center along the Z axis.
Spring - Strength of the spring.
Damper - Amount by which the spring is reduced when active.
Min Distance - Distances greater than this will not cause the spring to activate.
Max Distance - Distances less than this will not cause the spring to activate.
Break Force - The force that needs to be applied for this joint to break.
Break Torque - The torque that needs to be applied for this joint to break.

Details

Spring Joints allow a Rigidbodied GameObject to be pulled toward a particular "target" position. This position will either be another Rigidbodied GameObject or the world. As the GameObject travels further away from this "target" position, the Spring Joint applies forces that pull it back toward its original "target" position. This creates an effect very similar to a rubber band or a slingshot.

The "target" position of the spring is determined by the relative position from the Anchor to the Connected Body (or the world) when the Spring Joint is created, or when Play mode is entered. This makes the Spring Joint very effective at setting up jointed characters or objects in the Editor, but it is harder to create push/pull spring behaviors at runtime through scripting. If you want to primarily control a GameObject's position using a Spring Joint, it is best to create an empty GameObject with a Rigidbody, and set that to be the Connected Rigidbody of the jointed object. Then in scripting you can change the position of the Connected Rigidbody and see your spring move in the ways you expect.

Spring & Damper

Spring is the strength of the force that draws the object back toward its "target" position. If this is 0, then there is no force that pulls on the object, and it will behave as if no Spring Joint is attached at all.

Damper is the resistance encountered by the Spring force. The lower this is, the springier the object will be. As the Damper is increased, the amount of bounciness caused by the joint will be reduced.

Min & Max Distance

If the position of your object falls in between the Min and Max Distances, the joint will not be applied to your object. The position must be moved outside of these values for the joint to activate.


iOS

iOS physics optimization hints can be found here.

Page last updated: 2011-01-12



RandomNumbers

Randomly chosen items or values are important in many games. This section shows how you can use Unity's built-in random functions to implement some common game mechanics.

Choosing a Random Item from an Array

Picking an array element at random boils down to choosing a random integer between zero and the array's maximum index value (which is equal to the length of the array minus one). This is easily done using the built-in Random.Range function:-

var element = myArray[Random.Range(0, myArray.Length)];

Note that Random.Range returns a value from a range that includes the first parameter but excludes the second, so using myArray.Length here gives the correct result.

Choosing Items with Different Probabilities

Sometimes, you need to choose items at random but with some items more likely to be chosen than others. For example, an NPC may react in several different ways when it encounters a player.

You can visualise these different outcomes as a paper strip divided into sections each of which occupies a fraction of the strip's total length. The fraction occupied is equal to the probability of that outcome being chosen. Making the choice is equivalent to picking a random point along the strip's length (say by throwing a dart) and then seeing which section it is in.

In the script, the paper strip is actually an array of floats that contain the different probabilities for the items in order. The random point is obtained by multiplying Random.value by the total of all the floats in the array (they need not add up to 1; the significant thing is the relative size of the different values). To find which array element the point is "in", firstly check to see if it is less than the value in the first element. If so, then the first element is the one selected. Otherwise, subtract the first element's value from the point value and compare that to the second element and so on until the correct element is found. In code, this would look something like the following:-

function Choose(probs: float[]) {
	var total = 0.0;

	for (elem in probs) {
		total += elem;
	}

	var randomPoint = Random.value * total;

	for (i = 0; i < probs.Length; i++) {
		if (randomPoint < probs[i])
			return i;
		else
			randomPoint -= probs[i];
	}

	return probs.Length - 1;
}

Note that the final return statement is necessary because Random.value can return a result of 1. In this case, the search will not find the random point anywhere. Changing the line

if (randomPoint < probs[i])

...to a less-than-or-equal test would avoid the extra return statement but would also allow an item to be chosen occasionally even when its probability is zero.
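The same algorithm can be tried outside Unity as a plain JavaScript port (a sketch; Math.random() stands in for Random.value, and since Math.random() never returns exactly 1 the final fallback return is unreachable here, but it is kept for parity with the UnityScript version):

```javascript
// Weighted random choice: returns the index of the selected item.
// probs holds relative weights; they need not sum to 1.
function choose(probs) {
    var total = 0;
    for (var i = 0; i < probs.length; i++) {
        total += probs[i];
    }

    // Pick a random point along the "paper strip" of length `total`.
    var randomPoint = Math.random() * total;

    for (var i = 0; i < probs.length; i++) {
        if (randomPoint < probs[i]) {
            return i;
        }
        randomPoint -= probs[i];
    }

    // Fallback, kept for parity with the Unity version above.
    return probs.length - 1;
}
```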

Shuffling a List

A common game mechanic is to choose from a known set of items but have them arrive in random order. For example, a deck of cards is typically shuffled so they are not drawn in a predictable sequence. You can shuffle the items in an array by visiting each element and swapping it with a randomly chosen element from the part of the array that has not yet been visited (the Fisher-Yates shuffle; swapping with an arbitrary index from the whole array would bias the result):-

function Shuffle(deck: int[]) {
	for (i = 0; i < deck.Length; i++) {
		// Choose the swap index from the elements not yet visited,
		// so every ordering is equally likely.
		var randomIndex = Random.Range(i, deck.Length);
		var temp = deck[i];
		deck[i] = deck[randomIndex];
		deck[randomIndex] = temp;
	}
}
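A plain JavaScript version of an in-place shuffle can be written the same way (a sketch using the Fisher-Yates form, which draws the swap index from the not-yet-visited part of the array so that every ordering is equally likely; Math.random() stands in for Unity's Random.Range):

```javascript
// In-place shuffle: each element is swapped with an element at a random
// index drawn from the remaining (unshuffled) part of the array.
function shuffle(deck) {
    for (var i = 0; i < deck.length; i++) {
        // Random integer in [i, deck.length)
        var randomIndex = i + Math.floor(Math.random() * (deck.length - i));
        var temp = deck[i];
        deck[i] = deck[randomIndex];
        deck[randomIndex] = temp;
    }
}
```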

Choosing from a Set of Items Without Repetition

A common task is to pick a number of items randomly from a set without picking the same one more than once. For example, you may want to generate a number of NPCs at random spawn points but be sure that only one NPC gets generated at each point. This can be done by iterating through the items in sequence, making a random decision for each as to whether or not it gets added to the chosen set. As each item is visited, the probability of its being chosen is equal to the number of items still needed divided by the number still left to choose from.

As an example, suppose that ten spawn points are available but only five must be chosen. The probability of the first item being chosen will be 5 / 10 or 0.5. If it is chosen then the probability for the second item will be 4 / 9 or 0.44 (ie, four items still needed, nine left to choose from). However, if the first was not chosen then the probability for the second will be 5 / 9 or 0.56 (ie, five still needed, nine left to choose from). This continues until the set contains the five items required. You could accomplish this in code as follows:-

var spawnPoints: Transform[];

function ChooseSet(numRequired: int) {
	var result = new Transform[numRequired];

	var numToChoose = numRequired;

	for (numLeft = spawnPoints.Length; numLeft > 0; numLeft--) {
		// Adding 0.0 casts the integers to float for the division; the
		// parentheses ensure the casts happen before dividing.
		var prob = (numToChoose + 0.0) / (numLeft + 0.0);

		if (Random.value <= prob) {
			numToChoose--;
			result[numToChoose] = spawnPoints[numLeft - 1];

			if (numToChoose == 0)
				break;
		}
	}

	return result;
}
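The selection procedure ports directly to plain JavaScript (a sketch; Math.random() stands in for Random.value, a growing result array replaces the preallocated Transform array, and no explicit float cast is needed since JavaScript division is floating-point):

```javascript
// Choose numRequired items from `items` without repetition. Each item is
// kept with probability (still needed) / (still remaining), which gives
// every item the same overall chance of being selected.
function chooseSet(items, numRequired) {
    var result = [];
    var numToChoose = numRequired;

    for (var numLeft = items.length; numLeft > 0; numLeft--) {
        var prob = numToChoose / numLeft;

        if (Math.random() <= prob) {
            numToChoose--;
            result.push(items[numLeft - 1]);

            if (numToChoose === 0) {
                break;
            }
        }
    }
    return result;
}
```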

Note that although the selection is random, items in the chosen set will be in the same order they had in the original array. If the items are to be used one at a time in sequence then the ordering can make them partly predictable, so it may be necessary to shuffle the array before use.

Random Points in Space

A random point in a cubic volume can be chosen by setting each component of a Vector3 to a value returned by Random.value:-

var randVec = Vector3(Random.value, Random.value, Random.value);

This gives a point inside a cube with sides one unit long. The cube can be scaled simply by multiplying the X, Y and Z components of the vector by the desired side lengths. If one of the axes is set to zero, the point will always lie within a single plane. For example, picking a random point on the "ground" is usually a matter of setting the X and Z components randomly and setting the Y component to zero.
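As a plain JavaScript sketch of the scaled-box case (the object literal stands in for Unity's Vector3; passing 0 for a side length pins the point to that plane, as in the "ground" example):

```javascript
// Random point inside a box with the given side lengths, anchored at the
// origin: each component is an independent uniform draw scaled by the size.
function randomPointInBox(sizeX, sizeY, sizeZ) {
    return {
        x: Math.random() * sizeX,
        y: Math.random() * sizeY,
        z: Math.random() * sizeZ
    };
}
```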

When the volume is a sphere (ie, when you want a random point within a given radius from a point of origin), you can use Random.insideUnitSphere multiplied by the desired radius:-

var randWithinRadius = Random.insideUnitSphere * radius;

Note that if you set one of the resulting vector's components to zero, you will *not* get a correct random point within a circle. Although the point is indeed random and lies within the right radius, the probability is heavily biased toward the edge of the circle and so points will be spread very unevenly. You should use Random.insideUnitCircle for this task instead:-

var randWithinCircle = Random.insideUnitCircle * radius;
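Outside Unity, a uniformly distributed point in a circle can be produced by rejection sampling, one standard way to obtain the kind of distribution Random.insideUnitCircle provides (a sketch, not Unity's actual implementation):

```javascript
// Uniform random point inside a circle of the given radius: draw points
// from the enclosing square and re-draw until one lands inside the unit
// circle. The rejection step is what keeps the distribution uniform,
// unlike naively scaling a random angle/radius pair.
function randomInsideCircle(radius) {
    while (true) {
        var x = Math.random() * 2 - 1;
        var y = Math.random() * 2 - 1;
        if (x * x + y * y <= 1) {
            return { x: x * radius, y: y * radius };
        }
    }
}
```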

Page last updated: 2011-09-13



Particle Systems

Note: This page describes the new particle system (Shuriken). For the legacy particle system, see Legacy Particle System.

Particle System (Shuriken)

Particle systems in Unity are used to create large amounts of smoke, steam, flames and other atmospheric effects.

To create a new particle system, either create a Particle System GameObject (select GameObject -> Create Other -> Particle System from the menu), or create an empty GameObject and add the ParticleSystem component to it (from the Component -> Effects menu).

The Particle System Inspector (Shuriken)

The Particle System Inspector shows one particle system at a time (the currently selected one), and it looks like this:

Individual particle systems can take on various complex behaviors by using Modules.

They can also be extended by being grouped together into Particle Effects.

If you press the button Open Editor ..., this will open up the Extended Particle Editor, that shows all of the particle systems under the same root in the scene tree. For more information on particle system grouping, see the section on Particle Effects.

Scene View Editing

When you create and edit particle systems, you work with the Inspector or the extended Particle Editor, and your changes are reflected in the Scene View. The Scene View has a Preview Panel, where playback of the currently selected Particle Effect can be controlled in Edit Mode, with actions for play, pause, stop and scrubbing playback time.

Playback time can be scrubbed by dragging on the Playback Time label. All playback controls have shortcut keys which can be customized in the Preferences window.

Particle System Curve Editor

MinMax curves

Many of the properties in the particle system modules describe a change of a value with time. That change is described via MinMax Curves. These time-animated properties (for example size and speed) have a pull-down menu on the right hand side, where you can choose between:

Constant: The value of the property will not change with time, so it is not displayed in the Curve Editor.

Random between constants: The value of the property is set at random between the two constants.

Curve: The value of the property changes with time based on the curve specified in the Curve Editor.

A property animated with a curve

Random between curves: A curve is generated at random between a min and a max curve, and the value of the property changes with time based on the generated curve.

A property animated as Random Between Two Curves

In the Curve Editor, the x-axis spans time between 0 and the value specified by the Duration property, and the y-axis represents the value of the animated property at each point in time. The range of the y-axis can be adjusted in the number field in the upper right corner of the Curve Editor. The Curve Editor currently displays all of the curves for a particle system in the same window.

Multiple curves in the same Curve Editor

Note that the "-" in the bottom-right corner will remove the currently selected curve, while the "+" will optimize it (that is, turn it into a parametrized curve with at most 3 keys).

For animated properties that describe vectors in 3D space, TripleMinMax curves are used; these are simply curves for the x-, y- and z-axes side by side, and they look like this:

Managing many curves in the Curve Editor

To avoid cluttering the Curve Editor, it is possible to toggle curves on and off by clicking on them in the Inspector. The Particle System Curve Editor can also be detached from the Inspector by right-clicking on the Particle System Curves title bar, after which you should see something like this:

A detached Curve Editor can be docked like any other window.

For more information on how curves work, see the Curve Editor documentation.

Particle System Colors and Gradients (Shuriken)

For properties that deal with color, the particle system makes use of the Color and Gradient Editor. It works in a similar way to the Curve Editor.

Color-based properties have a pull-down menu on the right hand side, where you can choose between:

Color: The color will be the same at all times (see Color Picker).

Gradient: The gradient (RGBA) varies with time, edited in the Gradient Editor.

Random Between Two Colors: The color varies with time and is chosen at random between two values specified in the Color Picker.

Random Between Two Gradients: The gradient (RGBA) is chosen at random between two values specified in the Gradient Editor and varies with time.

Page last updated: 2012-11-13



Particle System Curve Editor

MinMax curves

Many of the properties in the particle system modules describe a change of a value with time. That change is described via MinMax Curves. These time-animated properties (for example size and speed) have a pull-down menu on the right hand side, where you can choose between:

Constant: The value of the property will not change with time, so it is not displayed in the Curve Editor.

Random between constants: The value of the property is set at random between the two constants.

Curve: The value of the property changes with time based on the curve specified in the Curve Editor.

A property animated with a curve

Random between curves: A curve is generated at random between a min and a max curve, and the value of the property changes with time based on the generated curve.

A property animated as Random Between Two Curves

In the Curve Editor, the x-axis spans time between 0 and the value specified by the Duration property, and the y-axis represents the value of the animated property at each point in time. The range of the y-axis can be adjusted in the number field in the upper right corner of the Curve Editor. The Curve Editor currently displays all of the curves for a particle system in the same window.

Multiple curves in the same Curve Editor

Note that the "-" in the bottom-right corner will remove the currently selected curve, while the "+" will optimize it (that is, turn it into a parametrized curve with at most 3 keys).

For animated properties that describe vectors in 3D space, TripleMinMax curves are used; these are simply curves for the x-, y- and z-axes side by side, and they look like this:

Managing many curves in the Curve Editor

To avoid cluttering the Curve Editor, it is possible to toggle curves on and off by clicking on them in the Inspector. The Particle System Curve Editor can also be detached from the Inspector by right-clicking on the Particle System Curves title bar, after which you should see something like this:

A detached Curve Editor can be docked like any other window.

For more information on how curves work, see the Curve Editor documentation.

Page last updated: 2012-11-22



Particle System Color Editor

For properties that deal with color, the particle system makes use of the Color and Gradient Editor. It works in a similar way to the Curve Editor.

Color-based properties have a pull-down menu on the right hand side, where you can choose between:

Color: The color will be the same at all times (see Color Picker).

Gradient: The gradient (RGBA) varies with time, edited in the Gradient Editor.

Random Between Two Colors: The color varies with time and is chosen at random between two values specified in the Color Picker.

Random Between Two Gradients: The gradient (RGBA) is chosen at random between two values specified in the Gradient Editor and varies with time.

Page last updated: 2012-11-21



Particle System Gradient Editor

Gradient editor

The Gradient Editor is used for describing change of gradient with time. It animates the color (RGB-space, described by the markers at the bottom), and Alpha (described by the markers at the top).

You can add new markers for Alpha values by clicking near the top of the rectangle, and new ticks for Color by clicking near the bottom. The markers can be intuitively dragged along the timeline.

If an Alpha tick is selected, you can edit the value for that tick by dragging the alpha value.

If a Color tick is selected, the color can be modified by double clicking on the tick or clicking on the color bar.

To remove a marker, just drag it off the screen.

Page last updated: 2012-08-28



Particle System Inspector

The Particle System Inspector (Shuriken)

The Particle System Inspector shows one particle system at a time (the currently selected one), and it looks like this:

Individual particle systems can take on various complex behaviors by using Modules.

They can also be extended by being grouped together into Particle Effects.

If you press the Open Editor ... button, the Extended Particle Editor opens, showing all of the particle systems under the same root in the scene tree. For more information on particle system grouping, see the section on Particle Effects.

Page last updated: 2012-08-28



Particle System Modules Intro

A Particle System consists of a predefined set of modules that can be enabled and disabled. These modules describe the behavior of particles in an individual particle system.

Initially only a few modules are enabled. Adding or removing modules changes the behavior of the particle system. You can add new modules by pressing the (+) sign in the top-right corner of the Particle System Inspector. This opens a selection menu where you can choose the module you want to enable.

An alternative way to work with modules is to select "Show All Modules", at which point all of the modules will show up in the inspector.

Then you can enable / disable modules directly from the inspector by clicking the checkbox to the left.

Most of the properties are controllable by curves (see Curve Editor). Color properties are controlled via gradients which define an animation for color (see Color Editor).

For details on individual modules and their properties, see Particle System Modules

Page last updated: 2012-10-25



Particle System Modules

This page describes the individual modules and their properties. For an introduction to modules, see this page.

Initial Module

This module is always present; it cannot be removed or disabled.

Duration: The length of time the Particle System will be emitting particles.
Looping: Is the Particle System looping?
Prewarm: Only looping systems can be prewarmed, which means that the Particle System will have emitted particles at start as if it had already emitted particles for one cycle.
Start Delay: Delay in seconds that this Particle System will wait before emitting particles. Note that prewarmed looping systems cannot use a start delay.
Start Lifetime: The lifetime of particles in seconds (see MinMaxCurve).
Start Speed: The speed of particles when emitted (see MinMaxCurve).
Start Size: The size of particles when emitted (see MinMaxCurve).
Start Rotation: The rotation of particles when emitted (see MinMaxCurve).
Start Color: The color of particles when emitted (see MinMaxGradient).
Gravity Modifier: The amount of gravity that will affect particles during their lifetime.
Inherit Velocity: Factor controlling how much of the Particle System transform's velocity the particles should inherit (for moving Particle Systems).
Simulation Space: Simulate the Particle System in local space or world space.
Play On Awake: If enabled, the Particle System will automatically start when it is created.
Max Particles: The maximum number of particles the Particle System will emit.
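Properties such as Play On Awake can also be worked with from script. As a minimal sketch (not taken from the manual; it assumes the script is attached to the same GameObject as the ParticleSystem component), a system with Play On Awake disabled can be started on demand:

```csharp
// Hypothetical example: start a Shuriken Particle System from script
// when Play On Awake is disabled in the Initial Module.
using UnityEngine;

public class FireOnDemand : MonoBehaviour
{
    void Update()
    {
        if (Input.GetButtonDown("Fire1"))
        {
            ParticleSystem ps = GetComponent<ParticleSystem>();
            ps.Play();   // begin emitting particles
            // ps.Stop() would halt emission; ps.Pause() freezes the simulation
        }
    }
}
```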

Emission Module

Controls the rate of particles being emitted and allows spawning large groups of particles at certain moments over the Particle System's duration. Useful for explosions, where a bunch of particles need to be created at once.

Rate: Amount of particles emitted over Time (per second) or Distance (per meter). (See MinMaxCurve.)
Bursts (Time option only): Add bursts of particles that occur within the duration of the Particle System.
Time and Number of Particles: Specify the time (in seconds, within the duration) at which a specified number of particles should be emitted. Use + and - to adjust the number of bursts.

Shape Module

Defines the shape of the emitter: Sphere, Hemisphere, Cone, Box or Mesh. The initial force can be applied along the surface normal or in a random direction.

Sphere
Radius: Radius of the sphere (can also be manipulated by handles in the Scene View).
Emit from Shell: Emit from the shell of the sphere. If disabled, particles will be emitted from the volume of the sphere.
Random Direction: Should particles have a random direction when emitted, or a direction along the surface normal of the sphere?
Hemisphere
Radius: Radius of the hemisphere (can also be manipulated by handles in the Scene View).
Emit from Shell: Emit from the shell of the hemisphere. If disabled, particles will be emitted from the volume of the hemisphere.
Random Direction: Should particles have a random direction when emitted, or a direction along the surface normal of the hemisphere?
Cone
Angle: Angle of the cone. If the angle is 0, particles will be emitted in one direction (can also be manipulated by handles in the Scene View).
Radius: A value larger than 0 will essentially create a capped cone, changing the emission from a point to a disc (can also be manipulated by handles in the Scene View).
Emit From: Determines where emission originates. Possible values are Base, Base Shell, Volume and Volume Shell.
Box
Box X: Scale of the box in X (can also be manipulated by handles in the Scene View).
Box Y: Scale of the box in Y (can also be manipulated by handles in the Scene View).
Box Z: Scale of the box in Z (can also be manipulated by handles in the Scene View).
Random Direction: Should particles have a random direction when emitted, or a direction along the Z-axis of the box?
Mesh
Type: Particles can be emitted from either Vertex, Edge or Triangle.
Mesh: Select the Mesh that should be used as the emission shape.
Random Direction: Should particles have a random direction when emitted, or a direction along the surface of the mesh?

Velocity Over Lifetime Module

Directly animates the velocity of each particle. Mostly useful for particles which have complex physical but simple visual behavior (like smoke with turbulence and temperature loss) and have little interaction with the physical world.

XYZ: Use constant values, curves or random between two curves to control the movement of the particles. See MinMaxCurve.
Space: Local / World: Are the velocity values in local space or world space?

Limit Velocity Over Lifetime Module

Essentially used to simulate drag. Dampens or clamps velocity if it exceeds a certain threshold. Can be configured per axis or on the vector's length.

Separate Axis: Use for setting control per axis.
Speed: Specify the magnitude, as a constant or a curve, that will limit all axes of velocity.
XYZ: Control each axis separately. See MinMaxCurve.
Dampen: A (0-1) value that controls how much the exceeding velocity should be dampened. For example, a value of 0.5 will dampen exceeding velocity by 50%.

Force Over Lifetime Module

XYZ: Use constant values, curves or random between two curves to control the force applied to the particles. See MinMaxCurve.
Randomize: Randomize the force applied to the particles every frame.

Color Over Lifetime Module

Color: Controls the color of each particle during its lifetime. If some particles have a shorter lifetime than others, they will animate faster. Use a constant color, a random value between two colors, a gradient animation, or a random color between two gradients (see Gradient). Note that this color will be multiplied by the value in the Start Color property; if Start Color is black, Color Over Lifetime will not affect the particle.
Color Scale: Use the color scale for easy adjustment of the color or gradient.

Color By Speed Module

Animates particle color based on its speed. Remaps speed in the defined range to a color.

Color: Color used for remapping of speed. Use gradients for varying colors. See MinMaxGradient.
Color Scale: Use the color scale for easy adjustment of the color or gradient.
Speed Range: The min and max values defining the speed range used for remapping a speed to a color.

Size Over Lifetime Module

Size: Controls the size of each particle during its lifetime. Use a constant size, animate it using a curve, or specify a random size using two curves. See MinMaxCurve.

Size By Speed Module

Size: Size used for remapping of speed. Use curves for varying sizes. See MinMaxCurve.
Speed Range: The min and max values defining the speed range used for remapping a speed to a size.

Rotation Over Lifetime Module

Specify values in degrees.

Rotational Speed: Controls the rotational speed of each particle during its lifetime. Use a constant rotational speed, animate it using a curve, or specify a random rotational speed using two curves. See MinMaxCurve.

Rotation By Speed Module

Rotational Speed: Rotational speed used for remapping of a particle's speed. Use curves for varying rotational speeds. See MinMaxCurve.
Speed Range: The min and max values defining the speed range used for remapping a speed to a rotational speed.

External Forces Module

Multiplier: Scale factor that determines how much the particles are affected by wind zones (i.e., the wind force vector is multiplied by this value).

Collision Module

Set up collisions for the particles of this Particle System. World and planar collisions are supported. Planar collision is very efficient for simple collision detection. Planes are set up by referencing an existing transform in the scene or by creating a new empty GameObject for this purpose. Another benefit of planar collision is that particle systems with collision planes can be set up as prefabs. World collision uses raycasts, so it must be used with care to ensure good performance. However, for cases where approximate collisions are acceptable, world collision in Low or Medium quality can be very efficient.

Properties common for any Collision Module

Planes/World: Specify the collision type: Planes for planar collision or World for world collision.
Dampen: (0-1) When the particle collides, it will keep this fraction of its speed. Unless it is set to 1.0, the particle will become slower after collision.
Bounce: (0-1) When the particle collides, it will keep this fraction of the component of its velocity normal to the plane of collision.
Lifetime Loss: (0-1) The fraction of Start Lifetime lost on each collision. When lifetime reaches 0, the particle dies. For example, if a particle should die on its first collision, set this to 1.0.
Min Kill Speed: The minimum speed of a particle before it is killed.

Properties available only in the Planes Mode

Planes: Planes are defined by assigning a reference to a transform. This can be any transform in the scene and can be animated. Multiple planes can be used. Note: the Y-axis is used as the normal of a plane.
Visualization: Only used for visualizing the planes: Grid or Solid.
Grid: Rendered as gizmos; useful for quick indication of position and orientation in the world.
Solid: Renders a plane in the scene; useful for exact positioning of a plane.
Scale Plane: Resizes the visualization planes.
Particle Radius: The assumed radius of the particle for collision purposes.

Properties available only in the World Mode

Collides With: Filter for specifying colliders. Select Everything to collide with the whole world.
Collision Quality: The quality of the world collision.
High: All particles perform a scene raycast per frame. Note: this is CPU intensive; it should only be used with 1000 simultaneous particles (scene-wide) or fewer.
Medium: The particle system receives a share of the globally set Particle Raycast Budget (see Particle Raycast Budget) each frame. Particles are updated in a round-robin fashion, and particles that do not receive a raycast in a given frame look up and use older collisions stored in a cache. Note: this collision type is approximate, and some particles will leak, particularly at corners.
Low: Same as Medium, except the particle system is only awarded a share of the Particle Raycast Budget every fourth frame.
Voxel Size: Density of the voxels used for caching intersections in the Medium and Low quality settings. The size of a voxel is given in scene units. Usually 0.5 - 1.0 should be used (assuming metric units).

Sub Emitter Module

This is a powerful module that enables spawning of other Particle Systems at the following particle events: birth, death or collision of a particle.

Birth: Spawn another Particle System at the birth of each particle in this Particle System.
Death: Spawn another Particle System at the death of each particle in this Particle System.
Collision: Spawn another Particle System at the collision of each particle in this Particle System. IMPORTANT: collision needs to be set up using the Collision Module. See Collision Module.

Texture Sheet Animation Module

Animates the UV coordinates of the particle over its lifetime. Animation frames can be presented as a grid, or each row in the sheet can be a separate animation. The frames are animated with curves, or a random frame can be chosen between two curves. The speed of the animation is defined by Cycles.

IMPORTANT: The texture used for the animation is the one used by the material found in the Renderer module.
Tiles: Define the tiling of the texture.
Animation: Specify the animation type: Whole Sheet or Single Row.
Whole Sheet: Uses the whole sheet for the UV animation.
- Frame over Time: Controls the UV animation frame of each particle during its lifetime over the whole sheet. Use a constant, animate it using a curve, or specify a random frame using two curves. See MinMaxCurve.
Single Row: Uses a single row of the texture sheet for the UV animation.
- Random Row: If checked, the start row will be random; if unchecked, the row index can be specified (the first row is 0).
- Frame over Time: Controls the UV animation frame of each particle during its lifetime within the specified row. Use a constant, animate it using a curve, or specify a random frame using two curves. See MinMaxCurve.
- Cycles: Specify the speed of the animation.

Renderer Module

The Renderer module exposes the ParticleSystemRenderer component's properties. Note that even though a GameObject has a ParticleSystemRenderer component, its properties are only exposed here. When this module is removed or added, it is actually the ParticleSystemRenderer component that is removed or added.

Render Mode: Select one of the following particle render modes.
Billboard: Makes the particles always face the camera.
Stretched Billboard: Particles are stretched using the following parameters.
- Camera Scale: How much the camera speed is factored in when determining particle stretching.
- Speed Scale: Defines the length of the particle compared to its speed.
- Length Scale: Defines the length of the particle compared to its width.
Horizontal Billboard: Makes the particles align with the XZ plane.
Vertical Billboard: Makes the particles align with the Y axis while facing the camera.
Mesh: Particles are rendered using a mesh instead of a quad.
- Mesh: The reference to the mesh used for rendering particles.
Normal Direction: Value from 0 to 1 that determines how much normals point toward the camera (0) and how much they point sideways, toward the centre of the view (1).
Material: Material used by billboarded or mesh particles.
Sort Mode: The draw order of particles: by distance, youngest first, or oldest first.
Sorting Fudge: Use this to affect the draw order. Particle systems with lower sorting fudge numbers are more likely to be drawn last, and thus appear in front of other transparent objects, including other particles.
Cast Shadows: Should particles cast shadows? This may or may not be possible, depending on the material.
Receive Shadows: Should particles receive shadows? This may or may not be possible, depending on the material.
Max Particle Size: Set the max relative viewport size. Valid values: 0-1.

Page last updated: 2012-09-26



Particle System Grouping

An important feature of Unity's Particle System is that individual Particle Systems can be grouped by being parented to the same root. We will use the term Particle Effect for such a group. Particle Systems belonging to the same Particle Effect are played, stopped and paused together.

For managing complex particle effects, Unity provides a Particle Editor, which can be accessed from the Inspector by pressing Open Editor.

Overview of the Particle System Editor

You can toggle between Show: All and Show: Selected in this Editor. Show: All will render the entire particle effect. Show: Selected will only render the selected particle systems. What is selected will be highlighted with a blue frame in the Particle Editor and also shown in blue in the Hierarchy view. You can also change the selection both from the Hierarchy View and the Particle Editor, by clicking the icon in the top-left corner of the Particle System. To multi-select, use Ctrl+click on Windows and Command+click on the Mac.

You can explicitly control the rendering order of grouped particles (or otherwise spatially close particle emitters) by tweaking the Sorting Fudge property in the Renderer module.

Particle Systems in the same hierarchy are considered as part of the same Particle Effect. This hierarchy shows the setup of the effect shown above.
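The played/stopped/paused-together behavior of a Particle Effect can also be driven from script. The following is a hedged sketch (not from the manual): it simply walks the child hierarchy and forwards the call to every ParticleSystem under the root.

```csharp
// Hypothetical helper: control a whole Particle Effect (all Particle
// Systems parented under the same root) from a script on the root object.
using UnityEngine;

public class EffectControl : MonoBehaviour
{
    public void PlayEffect()
    {
        // GetComponentsInChildren also includes the root's own ParticleSystem.
        foreach (ParticleSystem ps in GetComponentsInChildren<ParticleSystem>())
            ps.Play();
    }

    public void StopEffect()
    {
        foreach (ParticleSystem ps in GetComponentsInChildren<ParticleSystem>())
            ps.Stop();
    }
}
```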

Page last updated: 2012-08-28



Mecanim Animation System

Unity has a rich and sophisticated animation system called Mecanim. Mecanim provides:

Typical setup in the Visual Programming Tool and the Animation Preview window

Mecanim workflow

Workflow in Mecanim can be split into three major stages.

1. Asset preparation and import. This is done by artists or animators, with 3rd party tools, such as Max or Maya. This step is independent of Mecanim features.
2. Character setup for Mecanim, which can be done in 2 ways:

   a. Humanoid character setup. Mecanim has a special workflow for humanoid models, with extended GUI support and retargeting. The setup involves creating and setting up an Avatar and tweaking Muscle definitions.
   b. Generic character setup. This is for anything like creatures, animated props, four-legged animals, etc. Retargeting is not possible here, but you can still take advantage of the rich feature set of Mecanim, including everything described below.

3. Bringing characters to life. This involves setting up animation clips, as well as interactions between them, and involves setup of State Machines and Blend Trees, exposing Animation Parameters, and controlling animations from code.

Mecanim comes with a lot of new concepts and terminology. If at any point, you need to find out what something means, go to our Animation Glossary.

Legacy animation system

While Mecanim is recommended for use in most situations, especially for working with humanoid animations, the Legacy animation system is still used in a variety of contexts. One of them is working with legacy animations and code (content created before Unity 4.0). Another is controlling animation clips with parameters other than time (for example, controlling an aiming angle). For information on the Legacy animation system, see this section.
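The "parameters other than time" case above can be sketched as follows. This is an illustrative example only (the clip name "aim" and the angle mapping are hypothetical): the Legacy system lets a script freeze a clip's playback and scrub its normalized time from a gameplay value such as the aim angle.

```csharp
// Hypothetical sketch of driving a Legacy animation clip by aim angle
// instead of time. Assumes an Animation component with a clip named "aim"
// that sweeps the character's aim from fully down to fully up.
using UnityEngine;

public class AimController : MonoBehaviour
{
    public float aimAngle;   // e.g. -90 (down) to 90 (up), set by gameplay code

    void Start()
    {
        animation["aim"].speed = 0f;   // stop automatic, time-based playback
        animation["aim"].layer = 1;    // play on top of the base locomotion layer
        animation.Play("aim");
    }

    void Update()
    {
        // Map the angle to a normalized position (0..1) within the clip.
        animation["aim"].normalizedTime = (aimAngle + 90f) / 180f;
    }
}
```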

Unity intends to phase out the Legacy animation system over time for all cases by merging the workflows into Mecanim.

Page last updated: 2012-11-02



A glossary of animation and Mecanim terms

Animation Clip related terms:

Animation Clip: Animation data that can be used for animated characters or simple animations. It is a simple "unit" piece of motion, such as (one specific instance of) "Idle", "Walk" or "Run". (sub-Asset)
Body Mask: A specification for which body parts to include or exclude for a skeleton. (Asset, .mask) Used in Animation Layers and in the importer.
Animation Curves: Curves can be attached to animation clips and controlled by various parameters from the game.

Avatar related terms:

Avatar: An interface for retargeting one skeleton to another. (sub-Asset)
Retargeting: Applying animations created for one model to another. (Process)
Rigging: The process of building a skeleton hierarchy of bone joints for your mesh. (Process) Done in an external tool, such as Max or Maya.
Skinning: The process of binding bone joints to the character's mesh or 'skin'. (Process) Done in an external tool, such as Max or Maya.
Muscle Definition: A Mecanim concept that allows you to have more intuitive control over the character's skeleton. When an Avatar is in place, Mecanim works in muscle space, which is more intuitive than bone space.
T-pose: The pose in which the character has its arms straight out to the sides, forming a "T". This is the required pose for the character in order to make an Avatar.
Bind-pose: The pose in which the character was modelled.
Human Template: A pre-defined bone-mapping. (Asset, .ht) Used for matching bones from FBX files to the Avatar.

Animator and Animator Controller related terms:

Animator Component: Component on a model that animates that model using the Mecanim animation system. The component has a reference to an Animator Controller asset that controls the animation. (Component)
Root Motion: Motion of the character's root, whether it is controlled by the animation itself or externally.
Animator Controller (Asset): The Animator Controller controls animation through Animation Layers with Animation State Machines and Animation Blend Trees, controlled by Animation Parameters. The same Animator Controller can be referenced by multiple models with Animator components. (Asset, .controller)
Animator Controller (Window): The window where the Animator Controller asset is visualized and edited. (Window)
Animation Layer: An Animation Layer contains an Animation State Machine that controls animations of a model or part of it. For example, you might have a full-body layer for walking/jumping and a higher layer for upper-body motions such as throwing an object or shooting. The higher layers take precedence for the body parts they control.
Animation State Machine: A graph controlling the interaction of Animation States. Each state references an Animation Blend Tree or a single Animation Clip.
Animation Blend Tree: Used for continuous blending between similar Animation Clips based on float Animation Parameters.
Animation Parameters: Used to communicate between scripting and the Animator Controller. Some parameters can be set in scripting and used by the controller, while others are based on Custom Curves in Animation Clips and can be sampled using the scripting API.
Inverse Kinematics (IK): The ability to control the character's body parts based on various objects in the world.

Non-Mecanim animation terms:

Animation Component: The component needed for non-Mecanim animations. (Component)
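The script-to-controller communication described under Animation Parameters can be sketched like this. The parameter names ("Speed", "Jump") are illustrative assumptions and must match float/bool parameters defined in your own Animator Controller.

```csharp
// Hypothetical sketch: setting Animation Parameters from script so that
// blend trees and state-machine transitions in the Animator Controller
// can react to them.
using UnityEngine;

public class LocomotionDriver : MonoBehaviour
{
    private Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        // Drive a float parameter (e.g. used by a locomotion Blend Tree).
        animator.SetFloat("Speed", Input.GetAxis("Vertical"));

        // Drive a bool parameter (e.g. used by a jump transition).
        if (Input.GetButtonDown("Jump"))
            animator.SetBool("Jump", true);
    }
}
```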

Page last updated: 2012-11-07



Asset Preparation and Import

Humanoid meshes

In order to take full advantage of Mecanim's humanoid animation system and retargeting, you need to have a rigged and skinned humanoid type mesh.

  1. A character model is generally made up of polygons in a 3D package, or converted to a polygon or triangulated mesh from a more complex mesh type before export.
  2. A joint hierarchy or skeleton which defines the bones inside the mesh and their movement in relation to one another, must be created to control the movement of the character. The process for creating the joint hierarchy is known as rigging.
  3. The mesh or skin must then be connected to the joint hierarchy in order to define which parts of the character mesh move when a given joint is animated. The process of connecting the skeleton to the mesh is known as skinning.
Stages for preparing a character (modeling, rigging, and skinning)

How to obtain humanoid models

There are three main ways to obtain humanoid models for use with the Mecanim Animation System:

  1. Use a procedural character system or character generator such as Poser, Makehuman or Mixamo. Some of these systems will rig and skin your mesh (eg, Mixamo) while others will not. Furthermore, these methods may require that you reduce the number of polygons in your original mesh to make it suitable for use in Unity.
  2. Purchase demo examples and character content from the Unity Asset Store.
  3. Also, you can of course prepare your own character from scratch.

Export & Verify

Unity imports a number of different generic and native 3D file formats. The format we recommend for exporting and verifying your model is FBX 2012 since it will allow you to:

Further details

The following pages cover the stages of preparing and importing animation assets in greater depth

(back to Mecanim introduction)

Page last updated: 2012-11-01



Preparing your own character

There are three main steps in creating an animated humanoid character from scratch: modelling, rigging and skinning.

Modelling

This is the process of creating your own humanoid mesh in a 3D modelling package - 3DSMax, Maya, Blender, etc. Although this is a whole subject in its own right, there are a few guidelines you can follow to ensure a model works well with animation in a Unity project.

Skin Mesh - Modelled, textured and triangulated

Rigging

This is the process of creating a skeleton of joints to control the movements of your model.

3D packages provide a number of ways to create joints for your humanoid rig. These range from ready-made biped skeletons that you can scale to fit your mesh, right through to tools for individual bone creation and parenting to create your own bone structure. Although the details are outside the scope of Unity, here are some general guidelines:

Biped Skeleton, positioned in T-pose

Skinning

This is the process of attaching the mesh to the skeleton.

Skinning involves binding vertices in your mesh to bones, either directly (rigid bind) or with blended influence to a number of bones (soft bind). Different software packages use different methods, eg, assigning individual vertices and painting the weighting of influence per bone onto the mesh. The initial setup is typically automated, say by finding the nearest influence or using "heatmaps". Skinning usually requires a fair amount of work and testing with animations in order to ensure satisfactory results for the skin deformation. Some general guidelines for this process include:

Interactive Skin Bind, one of many skinning methods

(back to AssetPreparationandImport)

(back to Mecanim introduction)

Page last updated: 2012-11-01



Importing Animations

Before a character model can be used, it must first be imported into your project. Unity can import native Maya (.mb or .ma) and Cinema 4D (.c4d) files, and also generic FBX files which can be exported from most animation packages (see this page for further details on exporting). To import an animation, simply drag the model file to the Assets folder of your project. When you select the file in the Project View you can edit the Import Settings in the Inspector:


The Import Settings Dialog for a mesh

See the FBX importer page for a full description of the available import options.

Splitting animations

(back to Mecanim introduction)

Page last updated: 2012-11-02



Splitting animations

An animated character typically has a number of different movements that are activated in the game in different circumstances. For example, it might need separate animations for walking, running, jumping, throwing, dying, etc. Depending on the way the model was animated, these separate movements might be imported as distinct animation clips or as one single clip where each movement simply follows on from the previous one. In cases where there is only a single clip, the clip must be split into its component animation sequences within Unity, which will involve some extra steps in your workflow.

Working with models that have pre-split animations

The simplest types of models to work with are those that contain pre-split animations. If you have an animation like that, the Animations tab in the Animation Importer Inspector will look like this:

You will see a list of available clips which you can preview by pressing Play in the Preview Window (lower down in the Inspector). The frame ranges of the clips can be edited if needed.

Working with models that have unsplit animations

For models where the clips are supplied as one continuous animation, the Animation tab in the Animation Importer Inspector will look like this:

In cases like this, you can define the frame ranges that correspond to each of the separate animation sequences (walking, jumping, etc). You can create a new animation clip by pressing (+) and selecting the range of frames that are included in it.

For example:


The Import Settings Options for Animation

In the Import Settings, the Split Animations table is where you tell Unity which frames in your asset file make up which Animation Clip. The names you specify here are used to activate them in your game.

Name: Defines the Animation Clip's name within Unity.
Start: The first frame of the animation. The frame number refers to the same frame as in the 3D program used to create the animation.
End: The last frame of the animation.
Wrap Mode: Defines how time beyond the playback range of the clip is treated (Once, Loop, PingPong, ClampForever).
Add Loop Frame: If enabled, an extra loop frame is inserted at the end of the animation. This frame matches the first frame in the clip. Use this if you want to make a looping animation and the first and last frames don't match up exactly.

Working with animation clips for Mecanim animations.

Lock Pose
Lock Root Rotation
Lock Height
Lock Root Position
Rotation Offset
Cycle Offset
Mirror
Body Mask: The parts of the body this animation clip affects.
Curves: Parametric curves.

Adding animations to models that do not contain them

You can add animation clips to an Animation component even for models without muscle definitions (ie, non-Mecanim). You need to specify the default animation clip in the Animation property, and the available animation clips in the Animations property. The animation clips you add to such a non-Mecanim model should also be setup in a non-Mecanim way (ie, the Muscle Definition property should be set to None)

For models that have muscle definitions (Mecanim), the process is different:

Importing Animations using multiple model files

Another way to import animations is to follow a naming scheme that Unity allows for the animation files. You create separate model files and name them with the convention 'modelName@animationName.fbx'. For example, for a model called "goober", you could import separate idle, walk, jump and walljump animations using files named "goober@idle.fbx", "goober@walk.fbx", "goober@jump.fbx" and "goober@walljump.fbx". Only the animation data from these files will be used, even if the original files are exported with mesh data.

An example of four animation files for an animated character (note that the .fbx suffix is not shown within Unity)

Unity automatically imports all four files and collects all animations into the file without the @ sign. In the example above, the goober.mb file will be set up to reference the idle, jump, walk and wallJump animations automatically.

For FBX files, simply export a model file with no animation ticked (eg, goober.fbx), and export the four clips as goober@animname.fbx with the desired keyframes for each (enable animation in the FBX export dialog).
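Once the clips have been gathered onto the base model, a script can play them by name. This sketch continues the hypothetical "goober" example above and assumes a Legacy Animation component holding the imported clips:

```csharp
// Hypothetical example: playing the clips imported from the
// "goober@idle.fbx", "goober@walk.fbx" and "goober@jump.fbx" files
// through the Legacy Animation component on the base goober model.
using UnityEngine;

public class GooberController : MonoBehaviour
{
    void Update()
    {
        // CrossFade blends smoothly into the named clip.
        if (Input.GetAxis("Vertical") > 0.1f)
            animation.CrossFade("walk");
        else
            animation.CrossFade("idle");

        if (Input.GetButtonDown("Jump"))
            animation.CrossFade("jump");
    }
}
```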

(back to Mecanim introduction)

Page last updated: 2012-11-02



Avatar Creation and Setup

The Mecanim Animation System is particularly well suited for working with animations for humanoid skeletons. Since humanoid skeletons are a very common special case and are used extensively in games, Unity provides a specialized workflow, and an extended tool set for humanoid animations.

Because of the similarity in bone structure, it is possible to map animations from one humanoid skeleton to another, allowing retargeting and inverse kinematics. With rare exceptions, humanoid models can be expected to have the same basic structure, representing the major articulate parts of the body, head and limbs. The Mecanim system makes good use of this idea to simplify the rigging and control of animations. A fundamental step in creating an animation is to set up a mapping between the simplified humanoid bone structure understood by Mecanim and the actual bones present in the skeleton; in Mecanim terminology, this mapping is called an Avatar. The pages in this section explain how to create an Avatar for your model.

Page last updated: 2012-11-05



Creating the Avatar

After importing an FBX file, you can specify the rig type in the Rig tab of the FBX importer options.

Humanoid animations

For a Humanoid rig, select Humanoid and click Apply. Mecanim will attempt to match up your existing bone structure to the Avatar bone structure. In many cases, it can do this automatically by analysing the connections between bones in the rig.

If the match has succeeded, you will see a check mark next to the Configure... menu.

Also, in the case of a successful match, an Avatar sub-asset is added to the FBX asset, which you will be able to see in the project view hierarchy.


Models with and without an Avatar sub-asset

The Avatar asset in the inspector

If Mecanim was unable to create the Avatar, you will see a cross next to the Configure... button, and no Avatar sub-asset will be added. When this happens, you need to configure the Avatar manually.

Non-Humanoid animations

Two options are available for non-Humanoid animations: Generic and Legacy. Generic animations are imported using Mecanim, but don't take advantage of the extra features available for Humanoid animations. Legacy animations use the animation system that was provided by Unity before Mecanim. There are some cases where it is still useful to work with Legacy animations (most notably with legacy projects that you don't want to update fully), but they are seldom needed for new projects. See this section of the manual for further details on Legacy animations.

(back to Avatar Creation and Setup)

(back to Mecanim introduction)

Page last updated: 2012-11-26



Configuring the Avatar

Since the Avatar is such an important aspect of the Mecanim system, it is important that it is configured properly for your model. So, whether the automatic Avatar creation fails or succeeds, you need to go into the Configure Avatar mode to ensure your Avatar is valid and properly set up. It is important that your character's bone structure matches Mecanim's predefined bone structure and that the model is in T-pose.

If the automatic Avatar creation fails, you will see a cross next to the Configure button.

If it succeeds, you will see a check/tick mark:

Here, success simply means that all of the required bones have been matched, but for better results you might want to match the optional bones as well and get the model into a proper T-pose.

When you go to the Configure ... menu, the editor will ask you to save your scene. The reason for this is that in Configure mode, the Scene View is used to display bone, muscle and animation information for the selected model alone, without displaying the rest of the scene.

Once you have saved the scene, you will see a new Avatar Configuration inspector, with a bone mapping.

The inspector shows which of the bones are required and which are optional - the optional ones can have their movements interpolated automatically. For Mecanim to produce a valid match, your skeleton needs to have at least the required bones in place. In order to improve your chances for finding a match to the Avatar, name your bones in a way that reflects the body parts they represent (names like "LeftArm", "RightForearm" are suitable here).

If the model does NOT yield a valid match, you can manually follow a similar process to the one used internally by Mecanim:-

  1. Sample Bind-pose (try to get the model closer to the pose with which it was modelled, a sensible initial pose)
  2. Automap (create a bone-mapping from an initial pose)
  3. Enforce T-pose (force the model closer to T-pose, which is the default pose used by Mecanim animations)

If the auto-mapping (Mapping->Automap) fails completely or partially, you can assign bones by dragging them either from the Scene or from the Hierarchy. If Mecanim thinks a bone fits, it will show up as green in the Avatar Inspector; otherwise it shows up in red.

Finally, if the bone assignment is correct, but the character is not in the correct pose, you will see the message "Character not in T-Pose". You can try to fix that with Enforce T-Pose or rotate the remaining bones into T-pose.

Human Template files

You can save the mapping of bones in your skeleton to the Avatar on disk as a "human template file" (extension *.ht), which can be reused by any characters that use this mapping. This is useful, for example, if your animators use a consistent layout and naming convention for all skeletons but Mecanim doesn't know how to interpret it.

You can then load the .ht file for each model, so that the manual remapping only needs to be done once.

(back to Avatar Creation and Setup)

(back to Mecanim introduction)

Page last updated: 2012-11-06



Muscle Definitions

Mecanim allows you to control the range of motion of different bones using Muscles.

Once the Avatar has been properly configured, Mecanim will "understand" the bone structure and allow you to start working in the Muscles tab of the Avatar Inspector. Here, it is very easy to tweak the character's range of motion and ensure the character deforms in a convincing way, free from visual artifacts or self-overlaps.

You can either adjust individual bones in the body (lower part of the view) or manipulate the character using predefined deformations which operate on several bones at once (upper part of the view).

Muscle Clips

In the Animation tab, you can set up Muscle Clips, which are animations for specific muscles and muscle groups.

You can also define which body parts these muscle clips apply to.

(back to Avatar Creation and Setup)

(back to Mecanim introduction)

Page last updated: 2012-11-02



Avatar Body Mask

Specific body parts can be selectively enabled or disabled in an animation using a so-called Body Mask. Body Masks are used in the Animation tab of the mesh import inspector and in Animation Layers. Body Masks enable you to tailor an animation to fit the specific requirements of your character more closely. For example, you may have a standard walking animation that includes both arm and leg motion, but if a character is carrying a large object with both hands then you wouldn't want its arms to swing as it walks. However, you could still use the standard walking animation by switching off the arm movements in the Body Mask.

The body parts included are: head, left arm, right arm, left hand, right hand, left leg, right leg and root (denoted by the "shadow" under the feet). In the Body Mask, you can also toggle inverse kinematics (IK) for hands and feet, which determines whether or not IK curves will be included in the animation.

Body Mask in the inspector (arms excluded)

In the Animation tab of the mesh import inspector, you will see a list called Clips that contains all the animation clips for the object. When you select an item from this list, the options for that animation clip, including the Body Mask editor, will be displayed.

You can also create Body Mask assets (Assets->Create->Avatar Body Mask), which show up as files with a .mask extension on disk.

Body Masks can be reused in Animator Controllers when specifying Animation Layers.

A benefit of using Body Masks is that they tend to reduce memory overhead, since body parts that are not active do not need their associated animation curves. Also, unused curves need not be calculated during playback, which will tend to reduce the CPU overhead of the animation.

(back to Mecanim introduction)

Page last updated: 2012-10-18



Retargeting

One of the most powerful features of Mecanim is retargeting of humanoid animations. This means that with relative ease, users can apply the same set of animations to various character models. Retargeting is only possible for humanoid models, where an Avatar has been configured, because this gives us a correspondence between the models' bone structures.

Recommended Hierarchy structure

When working with Mecanim animations, you can expect your scene to contain the following elements:-

Your project should also contain another character model with a valid Avatar.

If in doubt about the terminology, please consult the Animation Glossary

The recommended setup is to:

Then in order to reuse the same animations on another model, you need to:

(back to Mecanim introduction)

Page last updated: 2012-11-07



Inverse Kinematics

Most animation is produced by rotating the angles of joints in a skeleton to predetermined values. The position of a child joint changes according to the rotation of its parent and so the end point of a chain of joints can be determined from the angles and relative positions of the individual joints it contains. This method of posing a skeleton is known as forward kinematics.

However, it is often useful to look at the task of posing joints from the opposite point of view - given a chosen position in space, work backwards and find a valid way of orienting the joints so that the end point lands at that position. This can be useful when you want a character to touch an object at a point selected by the user or plant its feet convincingly on an uneven surface. This approach is known as Inverse Kinematics (IK) and is supported in Mecanim for any humanoid character with a correctly configured Avatar.

To set up IK for a character, you typically have objects around the scene that the character interacts with, and then set up the IK through script, in particular using Animator functions such as SetIKPositionWeight, SetIKRotationWeight, SetIKPosition, SetIKRotation, SetLookAtPosition, bodyPosition and bodyRotation.

In the illustration above, we show a character grabbing a cylindrical object. How do we make this happen?

We start out with a character that has a valid Avatar, and attach to it a script that actually takes care of the IK, let's call it IKCtrl:

using UnityEngine;
using System;
using System.Collections;

[RequireComponent(typeof(Animator))]
public class IKCtrl : MonoBehaviour {

	protected Animator animator;

	public bool ikActive = false;
	public Transform rightHandObj = null;

	void Start () 
	{
		animator = GetComponent<Animator>();
	}

	//a callback for calculating IK
	void OnAnimatorIK()
	{
		if(animator) {

			//if the IK is active, set the position and rotation directly to the goal
			if(ikActive) {

				//weight = 1.0 for the right hand means position and rotation will be at the IK goal (the place the character wants to grab)
				animator.SetIKPositionWeight(AvatarIKGoal.RightHand, 1.0f);
				animator.SetIKRotationWeight(AvatarIKGoal.RightHand, 1.0f);

				//set the position and the rotation of the right hand where the external object is
				if(rightHandObj != null) {
					animator.SetIKPosition(AvatarIKGoal.RightHand, rightHandObj.position);
					animator.SetIKRotation(AvatarIKGoal.RightHand, rightHandObj.rotation);
				}
			}

			//if the IK is not active, set the weights back to zero so the hand returns to its original animated position
			else {
				animator.SetIKPositionWeight(AvatarIKGoal.RightHand, 0);
				animator.SetIKRotationWeight(AvatarIKGoal.RightHand, 0);
			}
		}
	}
}

As we do not intend for the character to grab the entire object with its hand, we position a sphere where the hand should be on the cylinder, and rotate it accordingly.

This sphere should then be assigned to the Right Hand Obj property of the IKCtrl script.

Observe the character grabbing and releasing the object as you click the IKActive checkbox.

(back to Mecanim introduction)

Page last updated: 2012-11-07



Generic Animations

The full power of Mecanim is most evident when you are working with humanoid animations. However, non-humanoid animations are also supported although without the avatar system and other features. In Mecanim terminology, non-humanoid animations are referred to as Generic Animations.

To start working with a generic skeleton, go to the Rig tab in the FBX importer and choose Generic from the Animation Type menu.

Root node in generic animations

While in the case of humanoid animations we have knowledge about the center of mass and orientation, in the case of Generic animations the skeleton can be arbitrary, and we need to specify a reference bone, or the "root node". Selecting the root node allows us to establish a correspondence between animation clips for a generic model, and blend properly between animations that are not "in place". The root node is also essential for separating the animation of the bones relative to each other from the motion of the root in the world (controlled from OnAnimatorMove).
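
As a sketch of how root motion can be handled from script, defining OnAnimatorMove on a component next to the Animator lets you apply the root motion deltas yourself (this is an illustrative example, not the only way to use the callback):

using UnityEngine;

[RequireComponent(typeof(Animator))]
public class RootMotionCtrl : MonoBehaviour {

	protected Animator animator;

	void Start () 
	{
		animator = GetComponent<Animator>();
	}

	//when this callback is defined, the Animator no longer applies root motion automatically
	void OnAnimatorMove () 
	{
		//apply the root motion accumulated during the last evaluated frame
		transform.position += animator.deltaPosition;
		transform.rotation = animator.deltaRotation * transform.rotation;
	}
}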

Page last updated: 2012-11-06



Bringing characters to life

Page last updated: 2012-10-04



Looping Animation Clips

A common operation for people working with animations is to make sure they loop properly. It is important, for example, that the animation clip representing the walk cycle begins and ends in a similar pose (e.g. left foot on the ground) to ensure there is no foot sliding or strange jerky motions. Mecanim provides convenient tools for this. Animation clips can loop based on pose, rotation and position.

If you drag the Start or End points of the animation clip, you will see the looping fitness curves for all of the parameters based on which it is possible to loop. If you place the Start / End marker in a place where the curve for a property is green, it is more likely that the clip can loop properly. The loop match indicator shows how good the looping is for the selected range.

Clip ranges with bad match for Loop Pose

Clip ranges with good match for Loop Pose

Once the loop match indicator is green, Enabling Loop Pose (for example) will make sure the looping of the pose is artifact-free.

For more details on animation clip options, see Animation Clip reference
(back to Mecanim introduction)

Page last updated: 2012-11-16



Animator Component and Window

The Animator Component

Any GameObject that has an Avatar will also have an Animator component, which is the link between the character and its behavior.

The Animator component references an Animator Controller, which is used for setting up behavior on the character. This includes setup of State Machines, Blend Trees, and events to be controlled from script.

Properties

Controller: The animator controller attached to this character
Avatar: The Avatar for this character
Apply Root Motion: Should we control the character's position from the animation itself, or from script?
Animate Physics: Should the animation interact with physics?
Culling Mode: Culling mode for animations
Always animate: Always animate; don't do culling
Based on Renderers: When the renderers are invisible, only root motion is animated; all other body parts will remain static while the character is invisible

Animator Controller

You can view and set up character behavior from the Animator Controller view (Menu: Window > Animator Controller).

An Animator Controller can be created from the Project View (Menu: Create > Animator Controller). This creates a .controller asset on disk, which looks like this in the Project Browser:

The Animator Controller asset on disk

Once the state machine has been set up, you can drag and drop the controller onto the Animator component of any character with an Avatar in the Hierarchy View.

The Animator Controller Window

The Animator Controller Window contains:

Note that the Animator Controller Window will always display the state machine from the most recently selected .controller asset, regardless of which scene is currently loaded.

(back to Mecanim introduction)

Page last updated: 2012-10-18



Animation State Machines

It is common for a character to have several different animations that correspond to different actions it can perform in the game. For example, it may breathe or sway slightly while idle, walk when commanded to and raise its arms in panic as it falls from a platform. Controlling when these animations are played back is potentially quite a complicated scripting task. Mecanim borrows a computer science concept known as a state machine to simplify the control and sequencing of a character's animations.

State machine basics

The basic idea is that a character is engaged in some particular kind of action at any given time. The actions available will depend on the type of gameplay but typical actions include things like idling, walking, running, jumping, etc. These actions are referred to as states, in the sense that the character is in a "state" where it is walking, idling or whatever. In general, the character will have restrictions on the next state it can go to rather than being able to switch immediately from any state to any other. For example, a running jump can only be taken when the character is already running and not when it is at a standstill, so it should never switch straight from the idle state to the running jump state. The options for the next state that a character can enter from its current state are referred to as state transitions. Taken together, the set of states, the set of transitions and the variable to remember the current state form a state machine.

The states and transitions of a state machine can be represented using a graph diagram, where the nodes represent the states and the arcs (arrows between nodes) represent the transitions. You can think of the current state as being a marker or highlight that is placed on one of the nodes and can then only jump to another node along one of the arrows.

The importance of state machines for animation is that they can be designed and updated quite easily with relatively little coding. Each state has an animation sequence associated with it that will play whenever the machine is in that state. This enables an animator or designer to define the possible sequences of character actions and animations without being concerned about how the code will work.

Mecanim state machines

Mecanim's Animation State Machines provide a way to overview all of the animation clips related to a particular character and allow various events in the game (for example user input) to trigger different animations.

Animation State Machines can be set up from the Animator Controller Window, and they look something like this:

State Machines consist of States, Transitions and Events and smaller Sub-State Machines can be used as components in larger machines.

(back to Mecanim introduction)

Page last updated: 2012-11-02



Animation States

Animation State

Animation States are the basic building blocks of an Animation State Machine. Each state contains an individual animation sequence (or blend tree) which will play while the character is in that state. When an event in the game triggers a state transition, the character moves into a new state and its behavior transitions to the corresponding animation sequence.

When you select a state in the Animator Controller, you will see the properties for that state in the inspector:-

Speed: The default speed of the animation
Motion: The animation clip assigned to this state
Foot IK: Should Foot IK be enabled for this state?
Transitions: The list of transitions originating from this state

The default state, displayed in brown, is the state that the machine will be in when it is first activated. If you want to change the default state, right-click on another state and select Set As Default from the context menu. The solo and mute checkboxes on each transition are used to control the behavior of the animation preview - see this page for further details.

A new state can be added by right-clicking on an empty space in the Animator Controller Window and selecting Create State->Empty from the context menu. Alternatively, you can drag an animation into the Animator Controller Window to create a state containing that animation. (Note that only Mecanim animations can be dragged into the controller - non-Mecanim animations will be rejected.) States can also contain Blend Trees.

Any State

Any State is a special state which is always present. It exists for the situation where you want to go to a specific state regardless of which state you are currently in. This has the same effect as adding the same outward transition to every state. Because of its special function, Any State cannot be the destination of a transition. (Note that Any State cannot be used as a way of picking a random state to enter next.)

(back to Animation State Machines)

Page last updated: 2012-10-18



Animation Transitions

Animation Transitions

Animation Transitions define what happens when you switch from one Animation State to another. Only one transition can be active at any given time.

Atomic: Is this transition atomic? (that is, it cannot be interrupted)
Conditions: When does the transition get triggered?

A condition consists of:

  1. A conditional predicate (If, If Not, Less, Greater, Equals, Not Equal, or Exit Time)
  2. An event parameter (If and If Not work with boolean parameters; Exit Time uses the time of the clip)
  3. A parameter value (if needed)

You can adjust the overlap between the two animation clips in a transition by dragging its start and end values.

(back to Animation State Machines)

Page last updated: 2012-10-18



Animation Parameters

Animation Parameters expose the operation of the state machine to game logic: events are triggered from game logic based on the values of these parameters. Typically you would work with parameters in three places:

  1. Setting up parameters in the Parameter Widget in the bottom-left corner of the Animator Controller Window
  2. Setting up conditions for transitions in the Transition Inspector, based on those parameters
  3. Controlling the parameters from script.

Event parameters can be of 4 basic types: Vector, Float, Int, and Bool, and they can be controlled from script via the functions SetVector, SetFloat, SetInt, and SetBool respectively.

Note that the values next to the parameters serve as default values for those parameters at startup, unless they are overridden by (or blended with) values from animation curves.

Thus, a complete animated character in the scene will have both an Animator Component and a script that controls the parameters in the Animator.

Here's an example of a script that modifies event parameters based on user input

using UnityEngine;

[RequireComponent(typeof(Animator))]
public class AvatarCtrl : MonoBehaviour {

	protected Animator animator;

	public float DirectionDampTime = .25f;

	void Start () 
	{
		animator = GetComponent<Animator>();
	}

	void Update () 
	{
		if(animator)
		{
			//get the current state
			AnimatorStateInfo stateInfo = animator.GetCurrentAnimatorStateInfo(0);

			//if we're in "Run" mode, respond to input for jump, and set the Jump parameter accordingly
			if(stateInfo.IsName("Base Layer.RunBT"))
			{
				if(Input.GetButton("Fire1")) 
					animator.SetBool("Jump", true);
			}
			else
			{
				animator.SetBool("Jump", false);
			}

			float h = Input.GetAxis("Horizontal");
			float v = Input.GetAxis("Vertical");

			//set event parameters based on user input
			animator.SetFloat("Speed", h*h+v*v);
			animator.SetFloat("Direction", h, DirectionDampTime, Time.deltaTime);
		}
	}
}

(back to Animation State Machines)

Page last updated: 2012-10-26



Animation Blend Trees

Blend Trees are used for continuous blending between similar animations based on float animation parameters. A typical example is blending between walk and run animations based on a speed parameter. Mecanim can ensure that the transition between the walk and the run is smooth, but it is important that the animation clips are aligned (e.g. both start with the left foot on the floor at 0.0 and have the right foot on the floor at 0.5). Another typical example is blending between RunLeft, Run and RunRight animations based on a direction parameter with values between 0.0 (left) and 1.0 (right).

To start working with a new blend tree, you need to:

  1. Right-click on empty space on the Animator Controller Window
  2. Select From New Blend Tree.
  3. Double-click on the Blend Tree to bring up the Blend Tree Inspector.

In the inspector, the first thing you need is to select the Animation Parameter that will control this Blend Tree.

Then you can add individual animations by clicking + -> Add Motion Field to add an animation clip to the blend tree. When you're done, it should look something like this:

The red vertical bar indicates the current value of the parameter. You can preview the resulting blend by pressing Play in the Animation Preview Window and dragging the bar left and right.

FAQ: When should I use State Machines and when should I use Blend Trees?
Answer: State machines are used for transitioning between unrelated animations based on discrete thresholds or boolean parameters. Blend trees are used for blending continuously between similar animations based on continuous (float) parameters.
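
A minimal script driving a Blend Tree might look like this (it assumes the controller defines a float parameter named Speed that the Blend Tree blends on; the parameter name is purely illustrative):

using UnityEngine;

[RequireComponent(typeof(Animator))]
public class BlendTreeCtrl : MonoBehaviour {

	protected Animator animator;

	void Start () 
	{
		animator = GetComponent<Animator>();
	}

	void Update () 
	{
		//0.0 plays the first motion, 1.0 the last; values in between blend
		float v = Input.GetAxis("Vertical");
		animator.SetFloat("Speed", Mathf.Abs(v));
	}
}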

(back to Mecanim introduction)

Page last updated: 2012-11-06



Advanced topics

(back to Mecanim introduction)

Page last updated: 2012-10-08



Animation Curves in Mecanim

Animation curves can be attached to animation clips in the Animations tab of the Animation Import Settings.

The curves on animation clips in Mecanim

The curve's X-axis represents normalized time and always ranges between 0.0 and 1.0 (corresponding to the beginning and the end of the animation clip respectively, regardless of its duration).

Double-clicking an animation curve will bring up the standard Unity curve editor (see this page for further details) which you can use to add keys to the curve. Keys are points along the curve's timeline where it has a value explicitly set by the animator rather than just using an interpolated value. Keys are very useful for marking important points along the timeline of the animation. For example, with a walking animation, you might use keys to mark the points where the left foot is on the ground, then both feet on the ground, right foot on the ground, etc. Once the keys are set up, you can move conveniently between key frames by pressing the Previous/Next Key Frame buttons. This will move the vertical red line and show the normalized time at the keyframe; the value you enter in the text box will then set the value of the curve at that time.

Animation Curves and Animator Controller parameters

If you have a curve with the same name as one of the parameters in the Animator Controller, then that parameter will take its value from the value of the curve at each point in the timeline. For example, if you make a call to GetFloat from a script, the returned value will be equal to the value of the curve at the time the call is made. Note that at any given point in time, there might be multiple animation clips attempting to set the same parameter from the same controller. In that case, the curve values from the multiple animation clips are blended. If an animation has no curve for a particular parameter then the blending will be done with the default value for that parameter.
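
As an illustrative sketch, a script reading such a curve-driven parameter could look like this (the parameter name "Runspeed" is hypothetical and assumes the controller has a float parameter of that name, matched by a curve on the clip):

using UnityEngine;

[RequireComponent(typeof(Animator))]
public class CurveReader : MonoBehaviour {

	protected Animator animator;

	void Start () 
	{
		animator = GetComponent<Animator>();
	}

	void Update () 
	{
		//returns the (blended) curve value at the current point in the timeline
		float runSpeed = animator.GetFloat("Runspeed");
		Debug.Log(runSpeed);
	}
}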

(back to Mecanim introduction)

Page last updated: 2012-11-07



Nested State Machines

For convenience, it is possible to nest Animation State Machines within other Animation State Machines. You can create a sub-state machine by right-clicking on an empty space within the Animator Controller window and selecting Create Sub-State Machine.

This forms a sub-state machine, to which you can navigate by double-clicking on the rhombic node:

Note, however, that you can only connect from states to other states. Thus when you create a transition from a state to a state machine, Unity will ask you to select a state from that machine. You can connect both up and down the hierarchy.

The State Inspector and the Transition Inspector will indicate which state machine each state comes from:

(back to State Machines introduction)

(back to Mecanim introduction)

Page last updated: 2012-10-08



Animation Layers

Unity uses Animation Layers for managing complex state machines for different body parts. An example of this is if you have a lower-body layer for walking-jumping, and an upper-body layer for throwing objects / shooting.

You can manage animation layers from the Layers Widget in the top-left corner of the Animator Controller.

You can add a new layer by pressing the + on the widget. On each layer, you can specify the body mask (the part of the body on which the animation would be applied), and the Blending type. Override means information from other layers will be ignored, while Additive means that the animation will be added on top of previous layers.
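
Layer weights can also be adjusted at runtime from script via Animator.SetLayerWeight. The sketch below fades a second layer in and out (it assumes the controller actually has a layer at index 1; layer 0 is the base layer):

using UnityEngine;

[RequireComponent(typeof(Animator))]
public class LayerWeightCtrl : MonoBehaviour {

	protected Animator animator;

	void Start () 
	{
		animator = GetComponent<Animator>();
	}

	void Update () 
	{
		//1.0 applies the layer fully, 0.0 disables it
		float weight = Input.GetButton("Fire2") ? 1.0f : 0.0f;
		animator.SetLayerWeight(1, weight);
	}
}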

The Mask property is there to specify the body mask used on this layer. For example if you want to use upper body throwing animations, while having your character walk or run, you would use an upper body mask, like this:

For more on Avatar Body Masks, you can read this section

Animation Layer syncing (Pro only)

Sometimes it is useful to be able to re-use the same state machine in different layers. For example if you want to simulate "wounded" behavior, and have "wounded" animations for walk / run / jump instead of the "healthy" ones. You can click the Sync checkbox on one of your layers, and then select the layer you want to sync with. The state machine structure will then be the same, but the actual animation clips used by the states will be distinct.

(back to Mecanim introduction)

Page last updated: 2012-11-07



Animation State Machine Preview (solo and mute)

Solo and Mute functionality

In complex state machines, it is useful to preview the operation of some parts of the machine separately. For this, you can use the Mute / Solo functionality. Muting means a transition will be disabled. Soloing a transition enables it and implicitly mutes the other transitions originating from the same state. You can set up mute and solo states either from the Transition Inspector or from the State Inspector (recommended), where you'll have an overview of all the transitions from that state.

Soloed transitions will be shown in green, while muted transitions in red, like this:

In the example above, if you are in State 0, only transitions to State A and State B will be available.

Known issues:

(back to State Machines introduction)

(back to Mecanim introduction)

Page last updated: 2012-10-08



Target Matching

Often in games, a situation arises where a character must move in such a way that a hand or foot lands at a certain place at a certain time. For example, the character may need to jump across stepping stones or jump and grab an overhead beam.

You can use the Animator.MatchTarget function to handle this kind of situation. Say, for example, you want to arrange a situation where the character jumps onto a platform and you already have an animation clip for it called Jump Up. To do this, follow the steps below.

using UnityEngine;
using System;

[RequireComponent(typeof(Animator))]  
public class TargetCtrl : MonoBehaviour {

	protected Animator animator;	

	//the platform object in the scene
	public Transform jumpTarget = null; 
	void Start () {
		animator = GetComponent<Animator>();
	}

	void Update () {
		if(animator) {
			if(Input.GetButton("Fire1"))		       
				animator.MatchTarget(jumpTarget.position, jumpTarget.rotation, AvatarTarget.LeftFoot, new MatchTargetWeightMask(Vector3.one, 1f), 0.11f, 0.223f);
		}		
	}
}

Attach that script onto the Mecanim model.

The script will move the character so that it jumps from its current position and lands with its left foot at the target. Bear in mind that the result of using MatchTarget will generally only make sense if it is called at the right point in gameplay.

(back to Mecanim introduction)

Page last updated: 2012-11-07



Root Motion


Body Transform

You must start by computing a body transform that will be the same for all humanoid characters (from a retargeting standpoint). Use the body mass center as the body position. The body orientation is an average of the lower and upper body orientation. Body orientation is at identity for the Avatar T-Pose.

The body position and orientation are stored in the Animation Clip (using the Muscle definitions set up in the Avatar). They are the only world-space curves stored in the Animation Clip. Everything else: muscle curves and IK goals (Hands and Feet) are stored relative to the body transform.

For example, the Hips (or Pelvis, etc.) are usually used to store the world-space position and orientation of the animation. In a straight walk or run animation, the Hips will swing and twist left/right and up/down, but the center of mass will follow an almost straight line. The average of the lower and upper body orientations will also be more stable, almost constant. The position of the hips will be different from one skeleton to another, depending on how they were modeled (sometimes in the middle of the left and right hips, sometimes offset back or up), and their orientation will also be totally arbitrary, so the Hips node is not a good choice of world-space transform for retargeting. Look at how the body transform behaves in a barrel jump, where the Hips fail completely!

Root Transform

The Root Transform is a projection on the Y plane of the Body Transform and is computed at runtime. At every Animator update, a delta Root Transform is computed for the current delta time. This delta transform is then applied to the Game Object to make it move.

The Animation Clip Editor settings (Root Transform Rotation, Root Transform Position (Y) and Root Transform Position (XZ)) let you control the Root Transform projection from the Body Transform. Depending on these settings, some parts of the Body Transform may be transferred to the Root Transform. For example, you can decide whether you want the motion's Y position to be part of the Root Motion (trajectory) or part of the pose (body transform), which is known as Baked into Pose.

Root Transform Rotation

Bake into Pose: The orientation will stay on the body transform (or Pose). The Root Orientation will be constant and the delta Orientation will be identity. This means that the Game Object will not be rotated at all by that AnimationClip.

Only AnimationClips that have similar start and stop Root Orientation should use this option. You will have a Green Light in the UI telling you that an AnimationClip is a good candidate. A suitable candidate would be a straight walk or a run.

Based Upon: This lets you set the orientation of the clip. Using Body Orientation, the clip will be oriented to follow the body's forward vector. This default setting works well for most Motion Capture (Mocap) data such as walks, runs, and jumps, but it will fail with motion like strafing, where the motion is perpendicular to the body's forward vector. In those cases you can manually adjust the orientation using the Offset setting. Finally, Original will automatically apply the authored offset found in the imported clip. It is usually used with keyframed data to respect the orientation set by the artist.

Offset: Used to enter the offset when that option is chosen for Based Upon.

Root Transform Position (Y)

This uses the same concepts described in Root Transform Rotation.

Bake Into Pose: The Y component of the motion will stay on the Body Transform (Pose). The Y component of the Root Transform will be constant and the delta Root Position Y will be 0. This means that this clip won't change the GameObject's height. Again, a green light tells you that a clip is a good candidate for baking Y motion into the pose.

Most AnimationClips will enable this setting. Only clips that should change the GameObject's height should have it turned off, such as jumping up or down.

Note: Animator.gravityWeight is driven by the Bake Into Pose position Y setting. When enabled, gravityWeight = 1; when disabled, gravityWeight = 0. gravityWeight is blended for clips when transitioning between states.

Based Upon: In a similar way to Root Transform Rotation, you can choose from Original or Mass Center (Body). There is also a Feet option that is very convenient for AnimationClips that change height (with Bake Into Pose disabled). When using Feet, the Root Transform Position Y will match the lowest foot Y for all frames, so the blending point always stays around the feet, which prevents floating problems when blending or transitioning.

Offset: In a similar way to Root Transform Rotation, you can manually adjust the AnimationClip height using the Offset setting.

Root Transform Position (XZ)

Again, this uses the same concepts described in Root Transform Rotation and Root Transform Position (Y).

Bake Into Pose will usually be used for idles, where you want to force the delta Position (XZ) to be 0. This stops the accumulation of small deltas that would otherwise drift after many evaluations. It can also be used for a keyframed clip with Based Upon set to Original to enforce an authored position set by the artist.

Loop Pose

Loop Pose (like Pose Blending in Blend Trees or Transitions) happens in the referential of the Root Transform. Once the Root Transform is computed, the Pose becomes relative to it. The relative pose difference between the start and stop frames is then computed and distributed over the range of the clip from 0 to 100%.
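As an illustration (plain JavaScript, not Unity's actual implementation), a minimal version of this correction subtracts a linearly increasing share of the start/stop difference, so the corrected pose matches at both ends of the loop:

```javascript
// Illustrative sketch: make a looping pose channel seamless by
// distributing the start/stop difference linearly over the clip (0-100%).
// pose(t) samples one pose channel at normalized time t in [0, 1].
function loopPoseCorrect(pose, t) {
  const diff = pose(1) - pose(0); // mismatch between stop and start frames
  return pose(t) - diff * t;      // distribute the difference over the clip
}

// A walk-cycle channel that drifts: it ends 0.2 higher than it starts.
const drifting = t => Math.sin(2 * Math.PI * t) + 0.2 * t;

// After correction, the start and end values agree (up to floating point),
// so the clip loops without a pop.
console.log(loopPoseCorrect(drifting, 0));
console.log(loopPoseCorrect(drifting, 1));
```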

Generic Root Motion and Loop Pose.

This works in essentially the same way as Humanoid Root Motion, but instead of using the Body Transform to compute/project a Root Transform, the transform set in Root Node is used. The Pose (all the bones whose transforms are below the Root Motion bone) is made relative to the Root Transform.

Page last updated: 2012-11-07



Scripting Root Motion

Sometimes your animation comes "in place", meaning that if you put it in a scene, it will not move the character it's on. In other words, the animation does not contain "root motion". In such cases, we can apply root motion from a script. To put everything together, follow the steps below (note that there are many ways to achieve the same result; this is just one recipe).

using UnityEngine;
using System.Collections;

[RequireComponent(typeof(Animator))]
public class RootMotionScript : MonoBehaviour {

	void OnAnimatorMove()
	{
		Animator animator = GetComponent<Animator>();

		if (animator)
		{
			Vector3 newPosition = transform.position;
			// "Runspeed" is a float parameter defined on the Animator Controller
			newPosition.z += animator.GetFloat("Runspeed") * Time.deltaTime;
			transform.position = newPosition;
		}
	}
}

(back to Mecanim introduction)

Page last updated: 2012-11-06



Legacy Animation system

Unity's Animation System allows you to create beautifully animated skinned characters. The Animation System supports animation blending, mixing, additive animations, walk cycle time synchronization, animation layers, control over all aspects of the animation playback (time, speed, blend-weights), mesh skinning with 1, 2 or 4 bones per vertex and finally physically based ragdolls.

For best practices on creating a rigged character with optimal performance in Unity, we recommend that you check out the section on Modeling Optimized Characters.

The following topics are covered on this page:

Importing Animations

Splitting animations

Importing Inverse Kinematics

When importing animated characters from Maya that are created using IK, you have to check the Bake IK & simulation box in the Import Settings. Otherwise, your character will not animate correctly.

Bringing the character into the Scene

Once you have imported your model, drag the object from the Project View into the Scene View or Hierarchy View.


The animated character is added by dragging it into the scene

The character above has three animations in the animation list and no default animation. You can add more animations to the character by dragging animation clips from the Project View on to the character (in either the Hierarchy or Scene View). This will also set the default animation. When you hit Play, the default animation will be played.

TIP: You can use this to quickly test if your animation plays back correctly. Also use the Wrap Mode to view different behaviors of the animation, especially looping.

Page last updated: 2012-10-04



Animation Editor Guide (Legacy)

The Animation View in Unity allows you to create and modify Animation Clips directly inside Unity. It is designed to act as a powerful and straightforward alternative to external 3D animation programs. In addition to animating movement, the editor also allows you to animate variables of materials and components and augment your Animation Clips with Animation Events, functions that are called at specified points along the timeline.

See the pages about Animation import and Animation Scripting for further information about these subjects.

The Animation View Guide is broken up into several pages that each focus on different areas of the View:-

Using the Animation View

This section covers the basic operations of the Animation View, such as creating and editing Animation Clips.

Using Animation Curves

This section explains how to create Animation Curves, add and move keyframes and set WrapModes. It also offers tips for using Animation Curves to their full advantage.

Editing Curves

This section explains how to navigate efficiently in the editor, create and move keys, and edit tangents and tangent types.

Objects with Multiple Moving Parts

This section explains how to animate Game Objects with multiple moving parts and how to handle cases where there is more than one Animation Component that can control the selected Game Object.

Using Animation Events

This section explains how to add Animation Events to an Animation Clip. Animation Events allow you to call a script function at specified points in the animation's timeline.

Page last updated: 2012-09-10



Animation Scripting (Legacy)

Unity's Animation System allows you to create beautifully animated skinned characters. The Animation System supports animation blending, mixing, additive animations, walk cycle time synchronization, animation layers, control over all aspects of the animation playback (time, speed, blend-weights), mesh skinning with 1, 2 or 4 bones per vertex as well as supporting physically based rag-dolls and procedural animation. To obtain the best results, it is recommended that you read about the best practices and techniques for creating a rigged character with optimal performance in Unity on the Modeling Optimized Characters page.

Making an animated character involves two things; moving it through the world and animating it accordingly. If you want to learn more about moving characters around, take a look at the Character Controller page. This page focuses on the animation. The actual animating of characters is done through Unity's scripting interface.

You can download example demos showing pre-setup animated characters. Once you have learned the basics on this page you can also see the animation script interface.

This page contains the following sections:-

Animation Blending

In today's games, animation blending is an essential feature to ensure that characters have smooth animations. Animators create separate animations, for example, a walk cycle, run cycle, idle animation or shoot animation. At any point in time during your game you need to be able to transition from the idle animation into the walk cycle and vice versa. Naturally, you want the transition to be smooth and avoid sudden jerks in the motion.

This is where animation blending comes in. In Unity you can have any number of animations playing on the same character. All animations are blended or added together to generate the final animation.

Our first step will be to make a character blend smoothly between the idle and walk animations. In order to make the scripter's job easier, we will first set the Wrap Mode of the animation to Loop. Then we will turn off Play Automatically to make sure our script is the only one playing animations.

Our first script for animating the character is quite simple; we only need some way to detect how fast our character is moving, and then fade between the walk and idle animations. For this simple test, we will use the standard input axes:-

function Update () {
   if (Input.GetAxis("Vertical") > 0.2)
      animation.CrossFade ("walk");
   else
      animation.CrossFade ("idle");
}

To use this script in your project:-

  1. Create a Javascript file using Assets->Create Other->Javascript.
  2. Copy and paste the code into it
  3. Drag the script onto the character (it needs to be attached to the GameObject that has the animation)

When you hit the Play button, the character will start walking in place when you hold the up arrow key and return to the idle pose when you release it.

Animation Layers

Layers are an incredibly useful concept that allow you to group animations and prioritize weighting.

Unity's animation system can blend between as many animation clips as you want. You can assign blend weights manually or simply use animation.CrossFade(), which will animate the weight automatically.

Blend weights are always normalized before being applied

Let's say you have a walk cycle and a run cycle, both having a weight of 1 (100%). When Unity generates the final animation, it will normalize the weights, which means the walk cycle will contribute 50% to the animation and the run cycle will also contribute 50%.
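That normalization step can be sketched in plain JavaScript (illustrative only; Unity performs this internally, and the function name is ours, not Unity API):

```javascript
// Illustrative sketch: normalize blend weights so they sum to 1.
function normalizeWeights(weights) {
  const total = weights.reduce((a, b) => a + b, 0);
  return weights.map(w => w / total);
}

// Walk and run both at full weight (1.0) each contribute 50%.
console.log(normalizeWeights([1.0, 1.0])); // each becomes 0.5

// Unequal raw weights keep their proportions after normalization.
console.log(normalizeWeights([3.0, 1.0])); // becomes 0.75 and 0.25
```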

However, you will generally want to prioritize which animation receives most weight when there are two animations playing. It is certainly possible to ensure that the weight sums up to 100% manually, but it is easier just to use layers for this purpose.

Layering Example

As an example, you might have a shoot animation, an idle and a walk cycle. The walk and idle animations would be blended based on the player's speed but when the player shoots, you would want to show only the shoot animation. Thus, the shoot animation essentially has a higher priority.

The easiest way to do this is to simply keep playing the walk and idle animations while shooting. To do this, we need to make sure that the shoot animation is in a higher layer than the idle and walk animations, which means the shoot animation will receive blend weights first. The walk and idle animations will receive weights only if the shoot animation doesn't use all 100% of the blend weighting. So, when CrossFading the shoot animation in, the weight will start out at zero and over a short period become 100%. In the beginning the walk and idle layer will still receive blend weights but when the shoot animation is completely faded in, they will receive no weights at all. This is exactly what we need!
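This top-down allocation can be sketched in plain JavaScript (a simplified illustration, not Unity's actual algorithm; the function and field names are ours): higher layers claim their weight first, and lower layers share whatever remains.

```javascript
// Illustrative sketch: allocate blend weights from the highest layer down.
// Each state has a name, a layer, and the weight it is requesting.
function allocateWeights(states) {
  const sorted = [...states].sort((a, b) => b.layer - a.layer);
  let remaining = 1.0;
  const result = {};
  for (const s of sorted) {
    const w = Math.min(s.weight, remaining); // claim up to what is left
    result[s.name] = w;
    remaining -= w;
  }
  return result;
}

// Shoot faded halfway in on layer 1; walk at full weight on layer 0:
// shoot gets 0.5 and walk receives the remaining 0.5. Once shoot reaches
// full weight, walk receives no weight at all.
console.log(allocateWeights([
  { name: "walk",  layer: 0, weight: 1.0 },
  { name: "shoot", layer: 1, weight: 0.5 },
]));
```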

function Start () {
   // Set all animations to loop
   animation.wrapMode = WrapMode.Loop;
   // except shooting
   animation["shoot"].wrapMode = WrapMode.Once;

   // Put idle and walk into lower layers (The default layer is always 0)
   // This will do two things
   // - Since shoot and idle/walk are in different layers they will not affect
   //   each other's playback when calling CrossFade.
   // - Since shoot is in a higher layer, the animation will replace idle/walk
   //   animations when faded in.
   animation["shoot"].layer = 1;

   // Stop animations that are already playing
   //(In case user forgot to disable play automatically)
   animation.Stop();
}

function Update () {
   // Based on the key that is pressed,
   // play the walk animation or the idle animation
   if (Mathf.Abs(Input.GetAxis("Vertical")) > 0.1)
      animation.CrossFade("walk");
   else
      animation.CrossFade("idle");

   // Shoot
   if (Input.GetButtonDown ("Fire1"))
      animation.CrossFade("shoot");
} 

By default the animation.Play() and animation.CrossFade() will stop or fade out animations that are in the same layer. This is exactly what we want in most cases. In our shoot, idle, run example, playing idle and run will not affect the shoot animation and vice versa (you can change this behavior with an optional parameter to animation.CrossFade if you like).

Animation Mixing

Animation mixing allows you to cut down on the number of animations you need to create for your game by having some animations apply to part of the body only. This means such animations can be used together with other animations in various combinations.

You add an animation mixing transform to an animation by calling AddMixingTransform() on the given AnimationState.

Mixing Example

An example of mixing might be something like a hand-waving animation. You might want to make the hand wave either when the character is idle or when it is walking. Without animation mixing you would have to create separate hand-waving animations for the idle and walking states. However, if you add the shoulder transform as a mixing transform to the hand-waving animation, it will have full control only from the shoulder joint to the hand. Since the rest of the body will not be affected by the hand-waving, it will continue playing the idle or walk animation. Consequently, only one animation is needed to make the hand wave while the rest of the body uses the idle or walk animation.

/// Adds a mixing transform using a Transform variable
var shoulder : Transform;
animation["wave_hand"].AddMixingTransform(shoulder);

Another example, using a transform path instead:

function Start () {
   // Adds a mixing transform using a path instead
   var mixTransform : Transform = transform.Find("root/upper_body/left_shoulder");
   animation["wave_hand"].AddMixingTransform(mixTransform);
}

Additive Animations

Additive animations and animation mixing allow you to cut down on the number of animations you have to create for your game, and are important for creating facial animations.

Suppose you want to create a character that leans to the sides as it turns while walking and running. This leads to four combinations (walk-lean-left, walk-lean-right, run-lean-left, run-lean-right), each of which needs an animation. Creating a separate animation for each combination clearly leads to a lot of extra work even in this simple case but the number of combinations increases dramatically with each additional action. Fortunately additive animation and mixing avoids the need to produce separate animations for combinations of simple movements.

Additive Animation Example

Additive animations allow you to overlay the effects of one animation on top of any others that may be playing. When generating additive animations, Unity will calculate the difference between the first frame in the animation clip and the current frame. Then it will apply this difference on top of all other playing animations.
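In plain JavaScript (illustrative only, not Unity API; the names and the weight parameter are ours), the additive contribution is the difference between the clip's current frame and its first frame, layered on top of the already-blended base result:

```javascript
// Illustrative sketch: additive blending adds (currentFrame - firstFrame)
// of the additive clip on top of the already-blended base pose.
function applyAdditive(basePose, additiveClip, time, weight) {
  const difference = additiveClip(time) - additiveClip(0);
  return basePose + difference * weight;
}

// A lean clip that rotates the spine from 0 to 30 degrees over its length.
const lean = t => 30 * t;

// Walk pose has the spine at 5 degrees; lean halfway through, full weight:
// 5 + (15 - 0) * 1.0 = 20 degrees.
console.log(applyAdditive(5, lean, 0.5, 1.0)); // 20
```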

Referring to the previous example, you could make animations to lean right and left and Unity would be able to superimpose these on the walk, idle or run cycle. This could be achieved with code like the following:-

private var leanLeft : AnimationState;
private var leanRight : AnimationState;

function Start () {
   leanLeft = animation["leanLeft"];
   leanRight = animation["leanRight"];

   // Put the leaning animation in a separate layer
   // So that other calls to CrossFade won't affect it.
   leanLeft.layer = 10;
   leanRight.layer = 10;

   // Set the lean animation to be additive
   leanLeft.blendMode = AnimationBlendMode.Additive;
   leanRight.blendMode = AnimationBlendMode.Additive;

   // Set the lean animation ClampForever
   // With ClampForever animations will not stop 
   // automatically when reaching the end of the clip
   leanLeft.wrapMode = WrapMode.ClampForever;
   leanRight.wrapMode = WrapMode.ClampForever;

   // Enable the animation and fade it in completely
   // We don't use animation.Play here because we manually adjust the time
   // in the Update function.
   // Instead we just enable the animation and set it to full weight
   leanRight.enabled = true;
   leanLeft.enabled = true;
   leanRight.weight = 1.0;
   leanLeft.weight = 1.0;

   // For testing just play "walk" animation and loop it
   animation["walk"].wrapMode = WrapMode.Loop;
   animation.Play("walk");
}

// Every frame just set the normalized time
// based on how much lean we want to apply
function Update () {
   var lean = Input.GetAxis("Horizontal");
   // normalizedTime is 0 at the first frame and 1 at the last frame in the clip
   leanLeft.normalizedTime = -lean;
   leanRight.normalizedTime = lean;
} 

Tip: When using Additive animations, it is critical that you also play some other non-additive animation on every transform that is also used in the additive animation, otherwise the animations will add on top of the last frame's result. This is most certainly not what you want.

Animating Characters Procedurally

Sometimes you want to animate the bones of your character procedurally. For example, you might want the head of your character to look at a specific point in 3D space which is best handled by a script that tracks the target point. Fortunately, Unity makes this very easy, since bones are just Transforms which drive the skinned mesh. Thus, you can control the bones of a character from a script just like the Transforms of a GameObject.

One important thing to know is that the animation system updates Transforms after the Update() function and before the LateUpdate() function. Thus if you want to do a LookAt() function you should do that in LateUpdate() to make sure that you are really overriding the animation.

Ragdolls are created in the same way. You simply have to attach Rigidbodies, Character Joints and Capsule Colliders to the different bones. This will then physically animate your skinned character.

Animation Playback and Sampling

This section explains how animations in Unity are sampled when they are played back by the engine.

AnimationClips are typically authored at a fixed frame rate. For example, you may create your animation in 3ds Max or Maya at a frame rate of 60 frames per second (fps). When importing the animation in Unity, this frame rate will be read by the importer, so the data of the imported animation is also sampled at 60 fps.

However, games typically run at a variable frame rate. The frame rate may be higher on some computers than on others, and it may also vary from one second to the next based on the complexity of what the camera is looking at at any given moment. Basically, this means we can make no assumptions about the exact frame rate the game is running at. So even if an animation is authored at 60 fps, it may be played back at a different frame rate, such as 56.72 fps or 83.14 fps, or practically any other value.

As a result, Unity must sample an animation at variable framerates, and cannot guarantee the framerate for which it was originally designed. Fortunately, animations for 3D computer graphics do not consist of discrete frames, but rather of continuous curves. These curves can be sampled at any point in time, not just at those points in time that correspond to frames in the original animation. In fact, if the game runs at a higher frame rate than the animation was authored with, the animation will actually look smoother and more fluid in the game than it did in the animation software.

For most practical purposes, you can ignore the fact that Unity samples animations at variable framerates. However, if you have gameplay logic that relies on animations that animate transforms or properties into very specific configurations, then you need to be aware that the re-sampling takes place behind the scenes. For example, if you have an animation that rotates an object from 0 to 180 degrees over 30 frames, and you want to know from your code when it has reached half way there, you should not do it by having a conditional statement in your code that checks if the current rotation is 90 degrees. Because Unity samples the animation according to the variable frame rate of the game, it may sample it when the rotation is just below 90 degrees, and the next time right after it reached 90 degrees. If you need to be notified when a specific point in an animation is reached, you should use an AnimationEvent instead.
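The pitfall can be demonstrated in plain JavaScript (an illustrative stand-in for the engine, not Unity API): sampling a 0-to-180-degree rotation at an uneven frame rate can step straight over 90 degrees, so an equality check never fires even though a crossing check does.

```javascript
// Illustrative sketch: a clip rotating 0 -> 180 degrees over 0.5 seconds,
// sampled at a variable frame rate.
const rotationAt = t => Math.min(180, 360 * t);

let t = 0;
let prev = rotationAt(0);
let sawExactly90 = false;
let crossed = false;
for (const dt of [0.07, 0.06, 0.08, 0.05, 0.07, 0.09, 0.08]) { // uneven dt
  t += dt;
  const rot = rotationAt(t);
  if (rot === 90) sawExactly90 = true;        // fragile equality check
  if (prev < 90 && rot >= 90) crossed = true; // robust crossing check
  prev = rot;
}
console.log(sawExactly90); // false: no sample landed exactly on 90 degrees
console.log(crossed);      // true: the rotation still passed 90 between samples
```

This is why a notification at a specific point in a clip belongs in an AnimationEvent rather than in an equality test on the sampled value.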

Note also that as a consequence of the variable framerate sampling, an animation that is played back using WrapMode.Once may not be sampled at the exact time of the last frame. In one frame of the game the animation may be sampled just before the end of the animation, and in the next frame the time can have exceeded the length of the animation, so it is disabled and not sampled further. If you absolutely need the last frame of the animation to be sampled exactly, you should use WrapMode.ClampForever which will keep sampling the last frame indefinitely until you stop the animation yourself.

Page last updated: 2012-09-05



Navmesh and Pathfinding

A navigation mesh (also known as the Navmesh) is a simplified representation of world geometry, which gameplay agents use to navigate the world. Typically an agent has a goal, or a destination, to which it is trying to find a path, and then navigate to that goal along the path. This process is called pathfinding. Note that Navmesh generation (or baking) is done by game developers inside the editor, while the pathfinding is done by agents at runtime based on that Navmesh.

In the complex world of games, there can be many agents, dynamic obstacles, and constantly changing accessibility levels for different areas in the world. Agents need to react dynamically to those changes. An agent's pathfinding task can be interrupted by or affected by things like collision avoidance with other characters, changing characteristics of the terrain, physical obstacles (such as closing doors), and an update to the actual destination.

Here is a simple example of how to set up a navmesh, and an agent that will do pathfinding on it:

Note that it is also possible to define custom NavMesh layers. These are needed for situations where some parts of the environment are easier for agents to pass through than others. For parts of the mesh that are not directly connected, it is possible to create Off Mesh Links.

Automatic off-mesh links

Navmesh geometry can also be marked up for automatic off-mesh link generation, like this:

Marking up geometry for automatic off-mesh link generation

Geometry marked up in this way will be checked during the Navmesh Baking process for creating links to other Navmesh geometry. This way, we can control the auto-generation for each GameObject. Whether an off-mesh link will be auto-generated in the baking process is also determined by the Jump distance and the Drop height properties in the Navigation Bake settings.

The NavMeshLayer assigned to auto-generated off-mesh links is the built-in layer Jump. This allows global control of the auto-generated off-mesh links' costs (see Navmesh layers).

Note that manual off-mesh links can also be set up (described here).

Page last updated: 2012-04-24



Navmesh Baking

Once the Navmesh geometry and layers are marked up, it's time to bake the Navmesh geometry.

Inside the Navigation window (Window->Navigation), go to the Bake tab (the upper-right corner), and click on the Bake button (the lower-right corner).

Navigation Bake Window

Here are the properties that affect Navmesh baking:

Radius: Radius of the "typical" agent (preferably the smallest).
Height: Height of the "typical" agent (the "clearance" needed to get a character through).
Max Slope: All surfaces with a slope higher than this will be discarded.
Step height: The height difference below which navmesh regions are considered connected.
Drop height: If this value is positive, off-mesh links will be placed for adjacent navmesh surfaces where the height difference is below this value.
Jump distance: If this value is positive, off-mesh links will be placed for adjacent navmesh surfaces where the horizontal distance is below this value.
Advanced
Min region area: Regions with areas below this threshold will be discarded.
Width inaccuracy %: Allowable width inaccuracy.
Height inaccuracy %: Allowable height inaccuracy.
Height mesh: If this option is on, the original height information is stored. This has performance implications for speed and memory usage.

Note that the baked navmesh is part of the scene and agents will be able to traverse it. To remove the navmesh, click on Clear when you're in the Bake tab.

(back to Navigation and Pathfinding)

Page last updated: 2012-04-24



Sound

Audio Listener

The Audio Listener acts like a microphone-style device. It receives input from any given Audio Source in the scene and plays sounds through the computer's speakers. For most applications it makes the most sense to attach the listener to the Main Camera. If the Audio Listener is within the boundaries of a Reverb Zone, reverberation is applied to all audible sounds in the scene. (PRO only) Furthermore, Audio Effects can be applied to the listener, and they will be applied to all audible sounds in the scene.


The Audio Listener, attached to the Main Camera

Properties

The Audio Listener has no properties. It simply must be added to work. It is always added to the Main Camera by default.

Details

The Audio Listener works in conjunction with Audio Sources, allowing you to create the aural experience for your games. When the Audio Listener is attached to a GameObject in your scene, any Sources that are close enough to the Listener will be picked up and output to the computer's speakers. Each scene can only have one Audio Listener to work properly.

If the Sources are 3D (see the import settings in Audio Clip), the Listener will emulate the position, velocity and orientation of the sound in the 3D world (you can tweak attenuation and 3D/2D behavior in great detail on the Audio Source). 2D sounds ignore any 3D processing. For example, if your character walks off a street into a night club, the night club's music should probably be 2D, while the individual voices of characters in the club should be mono, with their realistic positioning handled by Unity.

You should attach the Audio Listener to either the Main Camera or to the GameObject that represents the player. Try both to find what suits your game best.

Hints

Audio Source

The Audio Source plays back an Audio Clip in the scene. If the Audio Clip is a 3D clip, the source is played back at a given position and will attenuate over distance. The audio can be spread out between speakers (stereo to 7.1) (Spread) and morphed between 3D and 2D (Pan Level). This can be controlled over distance with falloff curves. Also, if the listener is within one or multiple Reverb Zones, reverberation is applied to the source. (PRO only) Individual filters can be applied to each audio source for an even richer audio experience. See Audio Effects for more details.


The Audio Source gizmo in the Scene View and its settings in the Inspector.

Properties

Audio Clip: Reference to the sound clip file that will be played.
Mute: If enabled, the sound will keep playing but muted.
Bypass Effects: Quickly "bypass" any filter effects applied to the audio source; an easy way to turn all effects on or off.
Play On Awake: If enabled, the sound will start playing the moment the scene launches. If disabled, you need to start it using Play() from scripting.
Loop: Enable this to make the Audio Clip loop when it reaches the end.
Priority: Determines the priority of this audio source among all the audio sources that coexist in the scene (Priority: 0 = most important, 256 = least important, default = 128). Use 0 for music tracks to avoid them occasionally being swapped out.
Volume: How loud the sound is at a distance of one world unit (one meter) from the Audio Listener.
Pitch: Amount of change in pitch due to slowdown/speed-up of the Audio Clip. A value of 1 is normal playback speed.
3D Sound Settings: Settings that are applied to the audio source if the Audio Clip is a 3D sound.
Pan Level: Sets how much the 3D engine affects the audio source.
Spread: Sets the spread angle of 3D stereo or multichannel sound in speaker space.
Doppler Level: Determines how much Doppler effect will be applied to this audio source (if set to 0, no effect is applied).
Min Distance: Within the MinDistance, the sound will stay at its loudest. Outside MinDistance it will begin to attenuate. Increase a sound's MinDistance to make it "louder" in the 3D world, and decrease it to make it "quieter".
Max Distance: The distance at which the sound stops attenuating. Beyond this point it will stay at the volume it has at MaxDistance units from the listener and will not attenuate any further.
Rolloff Mode: How fast the sound fades. The higher the value, the closer the listener has to be before hearing the sound (determined by a graph).
Logarithmic Rolloff: The sound is loud when you are close to the audio source, but as you move away from the object it decreases significantly fast.
Linear Rolloff: The further away from the audio source you go, the less you can hear it.
Custom Rolloff: The sound from the audio source behaves according to how you set the rolloff graph.
2D Sound Settings: Settings that are applied to the audio source if the Audio Clip is a 2D sound.
Pan 2D: Sets how much the engine pans the audio source in 2D (left/right).

Types of Rolloff

There are three rolloff modes: Logarithmic, Linear and Custom. Custom Rolloff can be modified by editing the volume-distance curve as described below. If you try to modify the volume-distance function while the mode is set to Logarithmic or Linear, it will automatically change to Custom Rolloff.
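As a rough illustration in plain JavaScript (the exact curves the engine uses are implementation details; these are the common textbook forms, and the function names are ours), logarithmic rolloff falls off inversely with distance beyond Min Distance, while linear rolloff fades evenly toward Max Distance:

```javascript
// Illustrative sketch of the two built-in rolloff shapes (not Unity API).
function logarithmicRolloff(distance, minDistance) {
  // Full volume inside minDistance, then inversely proportional to distance.
  return distance <= minDistance ? 1.0 : minDistance / distance;
}

function linearRolloff(distance, minDistance, maxDistance) {
  if (distance <= minDistance) return 1.0;
  if (distance >= maxDistance) return 0.0;
  return 1.0 - (distance - minDistance) / (maxDistance - minDistance);
}

// Doubling the distance halves the volume with logarithmic rolloff...
console.log(logarithmicRolloff(2, 1)); // 0.5
// ...while linear rolloff reaches half volume midway to Max Distance.
console.log(linearRolloff(5.5, 1, 10)); // 0.5
```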


The rolloff modes that an audio source can have.

Distance Functions

There are several properties of the audio that can be modified as a function of the distance between the audio source and the audio listener:

Volume: Amplitude (0.0 to 1.0) over distance.
Pan: Left (-1.0) to right (1.0) over distance.
Spread: Angle (0.0 to 360.0 degrees) over distance.
Low-Pass (only if a Low-Pass Filter is attached to the audio source): Cutoff frequency over distance.


Distance functions for Volume, Pan, Spread and the Low-Pass audio filter. The current distance to the Audio Listener is marked in the graph.

To modify the distance functions, edit the curves directly. For more information, see Editing Curves.

Creating Audio Sources

Audio Sources don't do anything without an assigned Audio Clip. The clip is the actual sound file that will be played back. The source is like a controller for starting and stopping playback of that clip and modifying other audio properties.

To create a new Audio Source:

  1. Import your audio files into your Unity Project. These become Audio Clips.
  2. Go to GameObject->Create Empty from the menu bar.
  3. With the new GameObject selected, choose Component->Audio->Audio Source.
  4. Assign the Audio Clip property of the Audio Source component in the Inspector.

Note: If you want to create an Audio Source just for one Audio Clip that you have in the Assets folder, you can simply drag that clip into the Scene View and a GameObject with an Audio Source component will be created automatically for it. Dragging a clip onto an existing GameObject will attach the clip along with a new Audio Source, if there isn't one already. If the object already has an Audio Source, the newly dragged clip will replace the one the source currently uses.

Platform-specific details

iOS

On mobile platforms compressed audio is encoded as MP3 for faster decompression. Be aware that this compression can remove samples at the end of the clip and potentially break a "perfectly looping" clip. Make sure the clip is right on an MP3 sample boundary to avoid sample clipping (tools to do this are widely available). For performance reasons, audio clips can be played back using the Apple hardware codec. To enable this, check the "Use Hardware" checkbox in the import settings. See Audio Clip for more details.

Android

On mobile platforms compressed audio is encoded as MP3 for faster decompression. Be aware that this compression can remove samples at the end of the clip and potentially break a "perfectly looping" clip. Make sure the clip is right on an MP3 sample boundary to avoid sample clipping (tools to do this are widely available).

オーディオ クリップ

Audio Clip は、Audio Source によって使用されるオーディオ データです。 Unity は、モノ、ステレオおよびマルチ チャンネル (8 つまで) のオーディオ アセットをサポートしています。 Unity は、次のオーディオ ファイル形式をサポートしています。 .aif.wav.mp3.oggおよび次の トラッカー モジュール ファイル形式: .xm.mod.itおよび .s3m 。 トラッカー モジュール アセットは、波形プレビューをアセット インポート インスペクタにレンダリングできないこと以外は、Unity のその他のオーディオ アセットと同じ働きをします。


「オーディオ クリップ Inspector

プロパティ

Audio Formatランタイム時に音声に使用される特定の形式。
Nativeファイル サイズが大きくなるにつれ、品質が高くなります。 非常に短い音響効果に最適です。
Compressedファイル サイズが小さくなるにつれ、品質が低くなるか、変わりやすくなります。 中程度の長さの音響効果や音楽に最適です。
3D Sound有効にすると、3D スペースで音声が再生されます。 モノとステレオの音声の両方を 3D で再生できます。
Force to mono有効にすると、オーディオ クリップが 1 つのチャンネル音声にダウンミックスされます。
Load TypeUnity がランタイムで音声をロードする方法。
Decompress on loadロード時に音声を解凍します。 オン ザ フライの解凍の性能オーバーヘッドを回避するため、より小さい圧縮音声に使用します。 ロード時の音声の解凍では、メモリ内で圧縮状態を維持する場合の 10 倍以上のメモリを使用するため、大きなファイルには使用しないでください。
Compressed in memoryメモリ内で圧縮状態を維持し、再生時には解凍します。 若干の性能オーバーヘッドが生じるため (Ogg/Vorbis 圧縮ファイルの esp.)、大きいファイルにのみ使用してください。技術的な制約により、このオプションはFMODオーディオを使用するプラットフォーム上でOgg Vorbisについて”Steam From Disc”(下記参照)に切り換わることに注意してください。
Stream from discディスクから直接オーディオ データを流します。これは、メモリの元の音声サイズの一部を使用します。 音楽や非常に長いトラックに使用してください。 一般的に、ハードウェアに応じて、1 ~ 2 の同時ストリームに抑えてください。
Compression「圧縮」クリップに適用される圧縮の量。 ファイル サイズに関する統計はスライダの下で確認できます。 スライダをドラッグして、再生を「十分良好」な状態にすべきですが、ファイルや配布上のニーズに見合うよう、十分小さいサイズにしてください。
Hardware Decoding(iOS のみ) iOS 機器上の圧縮オーディオに使用できます。 解凍時の CPU への負担を減らすため、Apple のハードウェア デコーダを使用します。 詳細については、プラットフォーム固有の詳細を確認してください。
Gapless looping(Android/iOS のみ) 完全ループのオーディオ ソース ファイル (非圧縮 PCM 形式) を圧縮する際に、そのループを残すために使用します。 標準の MPEG エンコーダは、ループ点周辺にサイレンスを取り込んでいますが、これはちょっとした「クリック」または「ポップ」として再生します。 Unity ではこれは円滑に扱われます。

オーディオ アセットのインポート

Unity supports both Compressed and Native audio. Any type of file (except MP3/Ogg Vorbis) will initially be imported as Native. Compressed audio files must be decompressed by the CPU while the game is running, but have smaller file size. If Stream is checked, the audio is decompressed on the fly; otherwise it is decompressed entirely as soon as it loads. Native PCM formats (WAV, AIFF) have the benefit of giving higher fidelity without increasing the CPU load, but the resulting files are much larger. Module files (.mod, .it, .s3m, .xm) can deliver very high quality with an extremely low footprint.

As a general rule of thumb, Compressed audio (or modules) is best for long files like background music or dialog, while uncompressed audio is better suited for short sound effects. Start with high compression and gradually reduce the amount with the compression slider, fine-tuning to a point just before the loss of sound quality becomes noticeable.

Using 3D Audio

If an audio clip is marked as a 3D Sound, it will be played back so as to simulate its position in the game world's 3D space. 3D sounds emulate distance and location by attenuating volume and panning across speakers. Both mono and multichannel sounds can be positioned in 3D. For multichannel audio, use the Audio Source's Spread option to spread and split out the discrete channels in speaker space. Unity offers a variety of options to control and fine-tune how audio behaves in 3D space; see the Audio Source documentation for details.
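The distance attenuation and panning described here can be pictured with a little standalone math. The sketch below (plain JavaScript, not Unity API) implements a linear rolloff between a minimum and maximum distance and a constant-power stereo pan; treat it as a simplified illustration, since Unity's actual rolloff curves are configurable and differ from this:

```javascript
// Linear distance rolloff: full volume inside minDistance,
// silence at or beyond maxDistance, linear in between.
function linearRolloff(distance, minDistance, maxDistance) {
	if (distance <= minDistance) return 1.0;
	if (distance >= maxDistance) return 0.0;
	return 1.0 - (distance - minDistance) / (maxDistance - minDistance);
}

// Constant-power stereo pan: pan = -1 (full left) .. +1 (full right).
// Keeps the perceived loudness roughly constant while panning.
function stereoGains(pan) {
	const angle = (pan + 1) * Math.PI / 4; // maps pan to 0 .. PI/2
	return { left: Math.cos(angle), right: Math.sin(angle) };
}
```

A sound halfway between the rolloff distances plays at half volume, and a centered pan splits the signal equally between both speakers.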

Platform-specific details

iOS

On mobile platforms, compressed audio is encoded as MP3 to reduce the CPU load during decompression.

For performance reasons, audio clips can be played back using the Apple hardware codec. To enable this, check the "Hardware Decoding" checkbox in the audio importer. Note that only one hardware audio stream can be decoded at a time, including the background iPod audio.

If the hardware decoder is not available, the decompression will fall back to the software decoder (on iPhone 3GS and later, Apple's software decoder is used in preference to Unity's own (FMOD) decoder).

Android

On mobile platforms, compressed audio is encoded as MP3 to reduce the CPU load during decompression.

Page last updated: 2007-11-16



Game Interface Elements

Unity provides several approaches for creating the graphical user interface (GUI) of your game.

You can use GUI Text and GUI Texture objects, or you can create your interface from scripts using UnityGUI.

The rest of this page contains a detailed guide for getting up and running with UnityGUI.

GUI Scripting Guide

Overview

UnityGUI allows you to create a wide variety of highly functional GUIs very quickly and easily. Rather than creating a GUI object, manually positioning it, and then writing a script that handles its functionality, you can do everything at once with just a few lines of code. The code produces GUI Controls that are instantiated, positioned and handled with a single function call.

For example, the following code will create and handle a button with no additional work in the editor or elsewhere:

// JavaScript
function OnGUI () {
	if (GUI.Button (Rect (10,10,150,100), "I am a button")) {
		print ("You clicked the button!");
	}
}


// C#
using UnityEngine;
using System.Collections;

public class GUITest : MonoBehaviour {

	void OnGUI () {
		if (GUI.Button (new Rect (10,10,150,100), "I am a button")) {
			print ("You clicked the button!");
		}
	}
}

The button created by the code above

Although this example is very simple, there are very powerful and complex techniques available for use in UnityGUI. GUI construction is a broad subject, but the following sections should help you get up to speed as quickly as possible. This guide can be read straight through or used as reference material.

UnityGUI Basics

This section covers the most important concepts of UnityGUI, giving you an overview as well as a set of working examples you can paste into your own projects. UnityGUI is very friendly to play with, so this is a good place to get started.

Controls

This section lists every available Control in UnityGUI, along with code samples and images showing the results.

Customization

Being able to change the appearance of the GUI to match the look of your game is important. All Controls in UnityGUI can be customized with GUIStyles and GUISkins, and this section explains how to use them.

Layout Modes

UnityGUI offers two ways to arrange your GUI: you can manually place each control on the screen, or you can use an automatic layout system that works in a similar way to HTML tables. Either system can be used as desired, and the two can be freely mixed. This section explains the functional differences between the two systems, including examples.

Extending UnityGUI

UnityGUI is very easy to extend with new Control types. This chapter shows you how to make simple "compound" controls, complete with integration into Unity's event system.

Extending Unity Editor

The GUI of the Unity editor is actually written using UnityGUI. Consequently, the editor is fully extensible using the same type of code you would use for in-game GUI. In addition, there are a number of editor-specific GUI widgets that are useful when creating custom editor GUIs.

Page last updated: 2012-11-13



Networked Multiplayer

Realtime networking is a complex field but Unity makes it easy to add networking features to your game. Nevertheless, it is useful to have some idea of the scope of networking before using it in a game. This section explains the fundamentals of networking along with the specifics of Unity's implementation. If you have never created a network game before then it is strongly recommended that you work through this guide before getting started.

High Level Overview

This section outlines all the concepts involved in networking and serves as an introduction to deeper topics.

Networking Elements in Unity

This section of the guide covers Unity's implementation of the concepts explained in the overview.

RPC Details

Remote Procedure Call or RPC is a way of calling a function on a remote machine. This may be a client calling a function on the server, or the server calling a function on some or all clients. This section explains RPC concepts in detail.

State Synchronization

State Synchronization is a method of regularly updating a specific set of data across two or more game instances running on the network.

Minimizing Bandwidth

Every choice you make about where and how to share data will affect the network bandwidth your game uses. This page explains how bandwidth is used and how to keep usage to a minimum.
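As a back-of-the-envelope illustration of how those choices add up (plain JavaScript, not a Unity API), the bandwidth consumed by state synchronization grows with the send rate, the payload per update, and the number of synchronized objects, so a quick estimate helps you budget:

```javascript
// Rough per-client bandwidth estimate for state synchronization.
// sendRate:        updates per second (an assumed figure; verify the actual
//                  network send rate configured in your own project).
// bytesPerUpdate:  serialized payload size per object, incl. packet overhead.
// observedObjects: number of synchronized objects visible to the client.
function bytesPerSecondPerClient(sendRate, bytesPerUpdate, observedObjects) {
	return sendRate * bytesPerUpdate * observedObjects;
}
```

For example, 15 updates per second of a 32-byte payload for 20 objects already costs 9600 bytes per second per client, which is why trimming payload size or send rate pays off quickly.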

Network View

Network Views are Components you use to share data across the network and are a fundamental aspect of Unity networking. This page explains them in detail.

Network Instantiate

A complex subject in networking is ownership of an object and determination of who controls what. Network Instantiation handles this task for you, as explained in this section. Also covered are some more sophisticated alternatives for situations where you need more control over object ownership.

Master Server

The Master Server is like a game lobby where servers can advertise their presence to clients. It can also enable communication from behind a firewall or home network using a technique called NAT punchthrough (with help from a facilitator) to make sure your players can always connect with each other. This page explains how to use the Master Server.

Page last updated: 2011-11-18



iphone-GettingStarted

Building games for devices like the iPhone and iPad requires a different approach than you would use for desktop PC games. Unlike the PC market, your target hardware is standardized and not as fast or powerful as a computer with a dedicated video card. Because of this, you will have to approach the development of your games for these platforms a little differently. Also, the features available in Unity for iOS differ slightly from those for desktop PCs.

Setting Up Your Apple Developer Account

Before you can run Unity iOS games on the actual device, you will need to have your Apple Developer account approved and set up. This includes establishing your team, adding your devices, and finalizing your provisioning profiles. All this setup is performed through Apple's developer website. Since this is a complex process, we have provided a basic outline of the tasks that must be completed before you can run code on your iOS devices. However, the best thing to do is follow the step-by-step instructions at Apple's iPhone Developer portal.

Note: We recommend that you set up your Apple Developer account before proceeding because you will need it to use Unity to its full potential with iOS.

Accessing iOS Functionality

Unity provides a number of scripting APIs to access the multi-touch screen, accelerometer, device geographical location system and much more. You can find out more about the script classes on the iOS scripting page.

Exposing Native C, C++ or Objective-C Code to Scripts

Unity allows you to call custom native functions written in C, C++ or Objective-C directly from C# scripts. To find out how to bind native functions, visit the plugins page.

Prepare Your Application for In-App Purchases

The Unity iOS runtime allows you to download new content and you can use this feature to implement in-app purchases. See the downloadable content manual page for further information.

Occlusion Culling

Unity supports occlusion culling which is useful for squeezing high performance out of complex scenes with many objects. See the occlusion culling manual page for further information.

Splash Screen Customization

See the splash screen customization page to find out how to change the image your game shows while launching.

Troubleshooting and Reporting Crashes

If you are experiencing crashes on the iOS device, please consult the iOS troubleshooting page for a list of common issues and solutions. If you can't find a solution here then please file a bug report for the crash (menu: Help > Report A Bug in the Unity editor).

How Unity's iOS and Desktop Targets Differ

Statically Typed JavaScript

Dynamic typing in JavaScript is always turned off in Unity when targetting iOS (this is equivalent to #pragma strict getting added to all your scripts automatically). Static typing greatly improves performance, which is especially important on iOS devices. When you switch an existing Unity project to the iOS target, you will get compiler errors if you are using dynamic typing. You can easily fix these either by using explicitly declared types for the variables that are causing errors or taking advantage of type inference.

MP3 Instead of Ogg Vorbis Audio Compression

For performance reasons, MP3 compression is favored on iOS devices. If your project contains audio files with Ogg Vorbis compression, they will be re-compressed to MP3 during the build. Consult the audio clip documentation for more information on using compressed audio on the iPhone.

PVRTC Instead of DXT Texture Compression

Unity iOS does not support DXT textures. Instead, PVRTC texture compression is natively supported by iPhone/iPad devices. Consult the texture import settings documentation to learn more about iOS texture formats.

Movie Playback

MovieTextures are not supported on iOS. Instead, full-screen streaming playback is provided via scripting functions. To learn about the supported file formats and scripting API, consult the movie page in the manual.

Further Reading

Page last updated: 2012-06-06



iphone-basic

This section covers the most common and important questions that come up when starting to work with iOS.

Prerequisites

I've just received iPhone Developer approval from Apple, but I've never developed for iOS before. What do I do first?

A: Download the SDK, get up and running on the Apple developer site, and set up your team, devices, and provisioning. We've provided a basic list of steps to get you started.

Can Unity-built games run in the iPhone Simulator?

A: No, but Unity iOS can build to the iPad Simulator if you're using the latest SDK. However, the simulator itself is not very useful for Unity because it does not simulate all inputs from iOS or properly emulate the performance you get on the iPhone/iPad. You should test out gameplay directly inside Unity using the iPhone/iPad as a remote control while it is running the Unity Remote application. Then, when you are ready to test performance and optimize the game, you publish to iOS devices.

Unity Features

How do I work with the touch screen and accelerometer?

A: In the scripting reference inside your Unity iOS installation, you will find classes that provide the hooks into the device's functionality that you will need to build your apps. Consult the Input System page for more information.

My existing particle systems seem to run very slowly on iOS. What should I do?

A: iOS has relatively low fillrate. If your particles cover a rather large portion of the screen with multiple layers, it will kill iOS performance even with the simplest shader. We suggest baking your particle effects into a series of textures off-line. Then, at run-time, you can use 1-2 particles to display them via animated textures. You can get fairly decent looking effects with a minimum amount of overdraw this way.

Can I make a game that uses heavy physics?

A: Physics can be expensive on iOS as it requires a lot of floating point number crunching. You should completely avoid MeshColliders if at all possible, but they can be used if they are really necessary. To improve performance, use a low fixed framerate using Edit->Time->Fixed Delta Time. A framerate of 10-30 is recommended. Enable rigidbody interpolation to achieve smooth motion while using low physics frame rates. In order to achieve a completely fluid framerate without oscillations, it is best to pick a fixed deltaTime value based on the average framerate your game is getting on iOS. Either 1:1 or half the frame rate is recommended. For example, if you get 30 fps, you should use a fixed frame rate of 30 or 15 fps (a fixed deltaTime of 0.033 or 0.066).
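The relationship between the recommended frame rates and deltaTime values above is just the reciprocal, as this small illustrative sketch (plain JavaScript, outside Unity) shows:

```javascript
// Fixed deltaTime is the reciprocal of the desired physics update rate.
function fixedDeltaTime(physicsFps) {
	return 1 / physicsFps;
}

// Given an average rendering framerate, the advice above suggests running
// physics at either the full rate (1:1) or half of it.
function suggestedPhysicsRates(averageFps) {
	return [averageFps, averageFps / 2];
}
```

For a game averaging 30 fps this yields physics rates of 30 or 15 fps, i.e. fixed deltaTime values of roughly 0.033 or 0.066 seconds.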

Can I access the gallery, music library or the native iPod player in Unity iOS?

A: Yes - if you implement it. Unity iPhone supports the native plugin system, where you can add any feature you need -- including access to Gallery, Music library, iPod Player and any other feature that the iOS SDK exposes. Unity iOS does not provide an API for accessing the listed features through Unity scripts.

UnityGUI Considerations

What kind of performance impact will UnityGUI make on my games?

A: UnityGUI is fairly expensive when many controls are used. It is ideal to limit your use of UnityGUI to game menus or very minimal GUI Controls while your game is running. It is important to note that every object with a script containing an OnGUI() call will require additional processor time -- even if it is an empty OnGUI() block. It is best to disable any scripts that have an OnGUI() call if the GUI Controls are not being used. You can do this by marking the script as enabled = false.

Any other tips for using UnityGUI?

A: Try using GUILayout as little as possible. If you are not using GUILayout at all from a given OnGUI() call, you can disable all GUILayout rendering with MonoBehaviour.useGUILayout = false; this doubles GUI rendering performance. Finally, use as few GUI elements as possible while rendering 3D scenes.

Page last updated: 2011-10-30



unity-remote

Unity Remote is an application that allows you to use your iOS device as a remote control for your project in Unity. This is useful during development since it is much quicker to test your project in the editor with remote control than to build and deploy it to the device after each change.

Where can I find Unity Remote?

Unity Remote is available for download from the App Store at no charge. If you prefer to build and deploy the application yourself, you can download the source here at the Unity website.

How do I build Unity Remote?

First, download the project source code here and unzip it to your preferred location. The zip file contains an XCode project to build Unity Remote and install it on your device.

Assuming you have already created the provisioning profile and successfully installed iOS builds on your device, you just need to open the Xcode project file UnityRemote.xcodeproj. Once XCode is launched, you should click "Build and Go" to install the app on your iOS device. If you have never built and run applications before, we recommend that you try building some of the Apple examples first to familiarize yourself with XCode and iOS.

Once Unity Remote is installed, make sure your device is connected via Wi-Fi to the same network as your development machine. Launch Unity Remote on your iPhone/iPad while Unity is running on your computer and select your computer from the list that appears. Now, whenever you enter Play mode in the Editor, your device will act as a remote control that you can use for developing and testing your game. You can control the application with the device wirelessly and you will also see a low-res version of the app on the device's screen.

Note: The Unity iOS editor cannot emulate the device's hardware perfectly, so you may not get the exact behavior (graphics performance, touch responsiveness, sounds playback, etc) that you would on a real device.

Xcode shows strange errors while deploying Unity Remote to my device. What should I do?

This indicates that the default Identifier in the Unity Remote project is not compatible with your provisioning profile. You will have to alter this Identifier manually in your XCode project. The Identifier must match your provisioning profile.

You will need to create an AppID with a trailing asterisk if you have not already done so; you can do this in the Program Portal on Apple's iPhone Developer Program. First, go to the Program Portal and choose the AppIDs tab. Then, click the Add ID button in the top right corner and type your usual bundle identifier followed by a dot and an asterisk (eg, com.mycompany.*) in the App ID Bundle Seed ID and Bundle Identifier field. Add the new AppID to your provisioning profile, then download and reinstall it. Don't forget to restart Xcode afterwards. If you have any problems creating the AppID, consult the Provisioning How-to section on Apple's website.


Don't forget to change the Identifier before you install Unity Remote on your device.

Open the Unity Remote project with XCode. From the menu, select Project->Edit Active Target "Unity Remote". This will open a new window entitled Target "Unity Remote" Info. Select the Properties tab. Change the Identifier property field from com.unity3d.UnityRemote to the bundle identifier in your AppID followed by "." (dot) followed by "UnityRemote". For example, if your provisioning profile contains ##.com.mycompany.* AppID, then change the Identifier field to com.mycompany.UnityRemote.

Next, select Build->Clean all targets from the menu, and compile and install Unity Remote again. You may also need to change the active SDK from Simulator to Device - 2.0 | Release. There is no problem using SDK 2.0 even if your device runs a newer version of the OS.

I'm getting really poor graphics quality when running my game in Unity Remote. What can I do to improve it?

When you use Unity Remote, the game actually runs on your Mac while its visual content is heavily compressed and streamed to the device. As a result, what you see on the device screen is just a low-res version of what the app would really look like. You should check how the game runs on the device occasionally by building and deploying the app (select File->Build & Run in the Unity editor).

Unity Remote is laggy. Can I improve it?

The performance of Unity Remote depends heavily on the speed of the Wi-Fi network, the quality of the networking hardware and other factors. For the best experience, create an ad-hoc network between your Mac and iOS device. Click the Airport icon on your Mac and choose "Create Network". Then, enter a name and password and click OK. On the device, choose Settings->Wi-Fi and select the new Wi-Fi network you have just created. Remember that an ad-hoc network is really a wireless connection that does not involve a wireless access point. Therefore, you will usually not have internet access while using ad-hoc networking.

Turning Bluetooth off on both your iPhone/iPad and your Mac should also improve connection quality.

If you do not need to see the game view on the device, you can turn image synchronization off in the Remote machine list. This will reduce the network traffic needed for the Remote to work.

The connection to Unity Remote is easily lost

This can be due to a problem with the installation or other factors that prevent Unity Remote from functioning properly. Try the following steps in sequence, checking whether the performance improves at each step before moving on to the next:

  1. First of all, check if Bluetooth is switched on. Both your Mac and iOS device should have Bluetooth disabled for best performance.
  2. Delete the settings file located at ~/Library/Preferences/com.unity3d.UnityEditoriPhone.plist
  3. Reinstall the game on your iPhone/iPad.
  4. Reinstall Unity on your Mac.
  5. As a last resort, performing a hard reset on the iOS device can sometimes improve the performance of Unity Remote.

If you still experience problems then try installing Unity Remote on another device (in another location if possible) and see if it gives you better results. There could be problems with RF interference or other software influencing the performance of the wireless adapter on your Mac or iOS device.

Unity Remote doesn't see my Mac. What should I do?

Page last updated: 2012-08-29



iphone-API

Most features of the iOS devices are exposed through the Input and Handheld classes. For cross-platform projects, UNITY_IPHONE is defined for conditionally compiling iOS-specific C# code.

Further Reading

Page last updated: 2012-11-26



iphone-Input

Desktop

Note: Keyboard, joystick and gamepad input work on the desktop versions of Unity (including webplayer and Flash) but not on mobiles.

Unity supports keyboard, joystick and gamepad input.

Virtual axes and buttons can be created in the Input Manager, and end users can configure Keyboard input in a nice screen configuration dialog.

You can set up joysticks, gamepads, keyboard, and mouse, then access them all through one simple scripting interface.

From scripts, all virtual axes are accessed by their name.

Every project has the following default input axes when it's created:

  • Horizontal and Vertical are mapped to w, a, s, d and the arrow keys.
  • Fire1, Fire2, Fire3 are mapped to Control, Option (Alt), and Command, respectively.
  • Mouse X and Mouse Y are mapped to the delta of mouse movement.
  • Window Shake X and Window Shake Y are mapped to the movement of the window.

Adding new Input Axes

If you want to add new virtual axes go to the Edit->Project Settings->Input menu. Here you can also change the settings of each axis.

You map each axis to two buttons on a joystick, mouse, or keyboard.

Name: The string used to check this axis from a script.
Descriptive Name: Positive value name displayed in the Input tab of the Configuration dialog for standalone builds.
Descriptive Negative Name: Negative value name displayed in the Input tab of the Configuration dialog for standalone builds.
Negative Button: The button used to push the axis in the negative direction.
Positive Button: The button used to push the axis in the positive direction.
Alt Negative Button: Alternative button used to push the axis in the negative direction.
Alt Positive Button: Alternative button used to push the axis in the positive direction.
Gravity: Speed in units per second that the axis falls toward neutral when no buttons are pressed.
Dead: Size of the analog dead zone. All analog device values within this range map to neutral.
Sensitivity: Speed in units per second that the axis will move toward the target value. This is for digital devices only.
Snap: If enabled, the axis value will reset to zero when pressing a button of the opposite direction.
Invert: If enabled, the Negative Buttons provide a positive value, and vice-versa.
Type: The type of inputs that will control this axis.
Axis: The axis of a connected device that will control this axis.
Joy Num: The connected Joystick that will control this axis.

Use these settings to fine tune the look and feel of input. They are all documented with tooltips in the Editor as well.
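To make the effect of Gravity, Sensitivity and Dead concrete, here is a small standalone simulation (plain JavaScript; an illustrative model, not the Input Manager's actual implementation) of how a digital axis value could evolve each frame under these settings:

```javascript
// One simulation step of a virtual axis.
// value:       current axis value (-1..1)
// target:      -1, 0 or +1 depending on which button is held
// sensitivity: units/second the axis moves toward a pressed target
// gravity:     units/second the axis falls back toward 0 when released
// dt:          frame time in seconds
function stepAxis(value, target, sensitivity, gravity, dt) {
	const speed = (target === 0 ? gravity : sensitivity) * dt;
	if (Math.abs(target - value) <= speed) return target;
	return value + Math.sign(target - value) * speed;
}

// The Dead setting zeroes small analog readings near neutral:
function applyDeadZone(raw, dead) {
	return Math.abs(raw) < dead ? 0 : raw;
}
```

Holding the positive button with a sensitivity of 3 moves the value from 0 to 1 in about a third of a second; releasing it with a gravity of 3 brings it back at the same rate.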

Using Input Axes from Scripts

You can query the current state from a script like this:

value = Input.GetAxis ("Horizontal");

An axis has a value between -1 and 1. The neutral position is 0. This is the case for joystick input and keyboard input.

However, Mouse Delta and Window Shake Delta are how much the mouse or window moved during the last frame. This means it can be larger than 1 or smaller than -1 when the user moves the mouse quickly.

It is possible to create multiple axes with the same name. When getting the input axis, the axis with the largest absolute value will be returned. This makes it possible to assign more than one input device to one axis name. For example, create one axis for keyboard input and one axis for joystick input with the same name. If the user is using the joystick, input will come from the joystick, otherwise input will come from the keyboard. This way you don't have to consider where the input comes from when writing scripts.
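The "largest absolute value wins" rule can be sketched outside Unity like this (an illustration in plain JavaScript, not Unity's actual code):

```javascript
// When several axes share one name, the value whose magnitude is largest
// is the one reported — so a centered joystick never masks keyboard input.
function resolveAxis(values) {
	let result = 0;
	for (const v of values) {
		if (Math.abs(v) > Math.abs(result)) result = v;
	}
	return result;
}
```

With a faint keyboard value of 0.1 and a strong joystick deflection of -0.8, the joystick wins and the sign is preserved.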

Button Names

To map a key to an axis, you have to enter the key's name in the Positive Button or Negative Button property in the Inspector.

The names of keys follow this convention:

  • Normal keys: "a", "b", "c" ...
  • Number keys: "1", "2", "3", ...
  • Arrow keys: "up", "down", "left", "right"
  • Keypad keys: "[1]", "[2]", "[3]", "[+]", "[equals]"
  • Modifier keys: "right shift", "left shift", "right ctrl", "left ctrl", "right alt", "left alt", "right cmd", "left cmd"
  • Mouse Buttons: "mouse 0", "mouse 1", "mouse 2", ...
  • Joystick Buttons (from any joystick): "joystick button 0", "joystick button 1", "joystick button 2", ...
  • Joystick Buttons (from a specific joystick): "joystick 1 button 0", "joystick 1 button 1", "joystick 2 button 0", ...
  • Special keys: "backspace", "tab", "return", "escape", "space", "delete", "enter", "insert", "home", "end", "page up", "page down"
  • Function keys: "f1", "f2", "f3", ...

The names used to identify the keys are the same in the scripting interface and the Inspector.

value = Input.GetKey ("a");

Mobile Input

On iOS and Android, the Input class offers access to touchscreen, accelerometer and geographical/location input.

Access to keyboard on mobile devices is provided via the iOS keyboard.

Multi-Touch Screen

The iPhone and iPod Touch devices are capable of tracking up to five fingers touching the screen simultaneously. You can retrieve the status of each finger touching the screen during the last frame by accessing the Input.touches property array.

Android devices don't have a unified limit on how many fingers they track. Instead, it varies from device to device and can be anything from two-touch on older devices to five fingers on some newer devices.

Each finger touch is represented by an Input.Touch data structure:

fingerId: The unique index for a touch.
position: The screen position of the touch.
deltaPosition: The screen position change since the last frame.
deltaTime: Amount of time that has passed since the last state change.
tapCount: The iPhone/iPad screen is able to distinguish quick finger taps by the user. This counter will let you know how many times the user has tapped the screen without moving the finger to the sides. Android devices do not count taps; this field is always 1.
phase: The so-called "phase", or state, of the touch. It can help you determine whether the touch has just begun, whether the user has moved their finger, or whether they have just lifted it.

Phase can be one of the following:

Began: A finger just touched the screen.
Moved: A finger moved on the screen.
Stationary: A finger is touching the screen but hasn't moved since the last frame.
Ended: A finger was lifted from the screen. This is the final phase of a touch.
Canceled: The system cancelled tracking for the touch, as when (for example) the user puts the device to their face or more than five touches happen simultaneously. This is the final phase of a touch.

Following is an example script which will shoot a ray whenever the user taps on the screen:

var particle : GameObject;
function Update () {
	for (var touch : Touch in Input.touches) {
		if (touch.phase == TouchPhase.Began) {
			// Construct a ray from the current touch coordinates
			var ray = Camera.main.ScreenPointToRay (touch.position);
			if (Physics.Raycast (ray)) {
				// Create a particle if hit
				Instantiate (particle, transform.position, transform.rotation);
			}
		}
	}
}

Mouse Simulation

On top of native touch support, Unity iOS/Android provides a mouse simulation. You can use mouse functionality from the standard Input class.

Device Orientation

Unity iOS/Android allows you to get a discrete description of the device's physical orientation in three-dimensional space. Detecting a change in orientation can be useful if you want to create game behaviors that depend on how the user is holding the device.

You can retrieve device orientation by accessing the Input.deviceOrientation property. Orientation can be one of the following:

Unknown: The orientation of the device cannot be determined. This happens, for example, when the device is rotated diagonally.
Portrait: The device is in portrait mode, with the device held upright and the home button at the bottom.
PortraitUpsideDown: The device is in portrait mode but upside down, with the device held upright and the home button at the top.
LandscapeLeft: The device is in landscape mode, with the device held upright and the home button on the right side.
LandscapeRight: The device is in landscape mode, with the device held upright and the home button on the left side.
FaceUp: The device is held parallel to the ground with the screen facing upwards.
FaceDown: The device is held parallel to the ground with the screen facing downwards.

Accelerometer

As the mobile device moves, a built-in accelerometer reports linear acceleration changes along the three primary axes in three-dimensional space. Acceleration along each axis is reported directly by the hardware as G-force values. A value of 1.0 represents a load of about +1g along a given axis while a value of -1.0 represents -1g. If you hold the device upright (with the home button at the bottom) in front of you, the X axis is positive along the right, the Y axis is positive directly up, and the Z axis is positive pointing toward you.

You can retrieve the accelerometer value by accessing the Input.acceleration property.

The following is an example script which will move an object using the accelerometer:

var speed = 10.0;
function Update () {
	var dir : Vector3 = Vector3.zero;

	// we assume that the device is held parallel to the ground
	// and the Home button is in the right hand

	// remap the device acceleration axis to game coordinates:
	//  1) XY plane of the device is mapped onto XZ plane
	//  2) rotated 90 degrees around Y axis
	dir.x = -Input.acceleration.y;
	dir.z = Input.acceleration.x;

	// clamp acceleration vector to the unit sphere
	if (dir.sqrMagnitude > 1)
		dir.Normalize();

	// Make it move 10 meters per second instead of 10 meters per frame...
	dir *= Time.deltaTime;

	// Move object
	transform.Translate (dir * speed);
}

Low-Pass Filter

Accelerometer readings can be jerky and noisy. Applying low-pass filtering on the signal allows you to smooth it and get rid of high frequency noise.

The following script shows you how to apply low-pass filtering to accelerometer readings:

var AccelerometerUpdateInterval : float = 1.0 / 60.0;
var LowPassKernelWidthInSeconds : float = 1.0;

private var LowPassFilterFactor : float = AccelerometerUpdateInterval / LowPassKernelWidthInSeconds; // tweakable
private var lowPassValue : Vector3 = Vector3.zero;
function Start () {
	lowPassValue = Input.acceleration;
}

function LowPassFilterAccelerometer() : Vector3 {
	lowPassValue = Mathf.Lerp(lowPassValue, Input.acceleration, LowPassFilterFactor);
	return lowPassValue;
}

The greater the value of LowPassKernelWidthInSeconds, the slower the filtered value will converge towards the current input sample (and vice versa). You can call the LowPassFilterAccelerometer() function in place of reading Input.acceleration directly wherever a smoothed value is needed.
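The same lerp-based filter can be demonstrated outside Unity. This standalone sketch (plain JavaScript, with Mathf.Lerp replaced by an explicit linear interpolation and scalar samples instead of Vector3) shows the filtered value converging toward a constant input, faster for smaller kernel widths:

```javascript
const accelerometerUpdateInterval = 1.0 / 60.0;

// Linear interpolation, equivalent to Mathf.Lerp for t in [0, 1].
function lerp(a, b, t) {
	return a + (b - a) * t;
}

// Run the filter over a stream of scalar samples and return the final value.
function lowPassFilter(samples, kernelWidthInSeconds) {
	const factor = accelerometerUpdateInterval / kernelWidthInSeconds;
	let value = samples.length > 0 ? samples[0] : 0;
	for (const s of samples) {
		value = lerp(value, s, factor);
	}
	return value;
}
```

Feeding a step from 0 to a constant 1.0 through the filter, a 0.25-second kernel tracks the new value much faster than a 1.0-second kernel, which is exactly the trade-off between responsiveness and smoothing.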

I'd like as much precision as possible when reading the accelerometer. What should I do?

Reading the Input.acceleration variable does not equal sampling the hardware. Put simply, Unity samples the hardware at a frequency of 60Hz and stores the result in the variable. In reality, things are a little bit more complicated: accelerometer sampling doesn't occur at consistent time intervals if the device is under significant CPU load. As a result, the system might report two samples during one frame, then one sample during the next frame.

You can access all measurements performed by the accelerometer during the frame. The following code illustrates a simple time-weighted average of all the accelerometer events that were collected within the last frame:

// JavaScript
function AverageAcceleration () : Vector3 {
	var period : float = 0.0;
	var acc : Vector3 = Vector3.zero;
	// Weight each event by its duration so irregular sampling doesn't bias the average
	for (var evnt : iPhoneAccelerationEvent in iPhoneInput.accelerationEvents) {
		acc += evnt.acceleration * evnt.deltaTime;
		period += evnt.deltaTime;
	}
	if (period > 0)
		acc *= 1.0 / period;
	return acc;
}
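The time-weighted averaging above can be checked with plain numbers (standalone JavaScript, scalar values standing in for Vector3):

```javascript
// Time-weighted average of samples: each event contributes in proportion
// to the interval it covers, so irregular sampling doesn't bias the result.
// events: array of { acceleration, deltaTime } records.
function weightedAverage(events) {
	let period = 0;
	let acc = 0;
	for (const e of events) {
		acc += e.acceleration * e.deltaTime;
		period += e.deltaTime;
	}
	return period > 0 ? acc / period : 0;
}
```

For instance, a 1 g reading covering 10 ms followed by a 3 g reading covering 30 ms averages to 2.5 g: the longer interval carries three times the weight.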

Further Reading

The Unity mobile input API is originally based on Apple's API. It may help to learn more about the native API to better understand Unity's Input API. You can find the Apple input API documentation here:

Note: The above links reference your locally installed iPhone SDK Reference Documentation and will contain native ObjectiveC code. It is not necessary to understand these documents for using Unity on mobile devices, but may be helpful to some!

iOS

Device geographical location

Device geographical location can be obtained via the iPhoneInput.lastLocation property. Before reading this property you should start location service updates using iPhoneSettings.StartLocationServiceUpdates() and check the service status via iPhoneSettings.locationServiceStatus. See the scripting reference for details.
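
As a sketch of the workflow just described (the latitude and longitude field names are assumed from the scripting reference and should be verified there), a script might start the service and log the last known position like this:

```javascript
function Start () {
	// Location updates must be started before lastLocation returns valid data.
	iPhoneSettings.StartLocationServiceUpdates();
}

function Update () {
	// Only read the location once the service is actually running.
	if (iPhoneSettings.locationServiceStatus == LocationServiceStatus.Running) {
		var loc = iPhoneInput.lastLocation;
		Debug.Log("Latitude: " + loc.latitude + ", Longitude: " + loc.longitude);
	}
}
```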

Page last updated: 2012-06-28



iOS-Keyboard

In most cases, Unity will handle keyboard input automatically for GUI elements but it is also easy to show the keyboard on demand from a script.

iOS

Using the Keyboard

GUI Elements

The keyboard will appear automatically when a user taps on editable GUI elements. Currently, GUI.TextField, GUI.TextArea and GUI.PasswordField will display the keyboard; see the GUI class documentation for further details.

Manual Keyboard Handling

Use the iPhoneKeyboard.Open function to open the keyboard. Please see the iPhoneKeyboard scripting reference for the parameters that this function takes.
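
For illustration, a minimal script might open the keyboard from a button and read the result once the user dismisses it; the done and text members used here are assumed from the iPhoneKeyboard scripting reference:

```javascript
private var keyboard : iPhoneKeyboard;

function OnGUI () {
	// Open the keyboard on demand with an empty initial string.
	if (keyboard == null && GUI.Button(Rect(10, 10, 200, 32), "Enter text")) {
		keyboard = iPhoneKeyboard.Open("");
	}
	// Once the user closes the keyboard, read what was typed.
	if (keyboard != null && keyboard.done) {
		Debug.Log("User typed: " + keyboard.text);
		keyboard = null;
	}
}
```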

Keyboard Type Summary

The Keyboard supports the following types:

iPhoneKeyboardType.Default: Letters. Can be switched to a keyboard with numbers and punctuation.
iPhoneKeyboardType.ASCIICapable: Letters. Can be switched to a keyboard with numbers and punctuation.
iPhoneKeyboardType.NumbersAndPunctuation: Numbers and punctuation. Can be switched to a keyboard with letters.
iPhoneKeyboardType.URL: Letters with slash and .com buttons. Can be switched to a keyboard with numbers and punctuation.
iPhoneKeyboardType.NumberPad: Only numbers from 0 to 9.
iPhoneKeyboardType.PhonePad: Keyboard used to enter phone numbers.
iPhoneKeyboardType.NamePhonePad: Letters. Can be switched to the phone keyboard.
iPhoneKeyboardType.EmailAddress: Letters with the @ sign. Can be switched to a keyboard with numbers and punctuation.

Text Preview

By default, an edit box will be created and placed on top of the keyboard after it appears. This works as a preview of the text that the user is typing, so the text is always visible to the user. However, you can disable text preview by setting iPhoneKeyboard.hideInput to true. Note that this works only for certain keyboard types and input modes. For example, it will not work for phone keypads and multi-line text input. In such cases, the edit box will always appear. iPhoneKeyboard.hideInput is a global variable and will affect all keyboards.

Keyboard Orientation

By default, the keyboard automatically follows the device orientation. To disable or enable rotation to a certain orientation, use the following properties available in iPhoneKeyboard:

autorotateToPortrait: Enable or disable autorotation to portrait orientation (button at the bottom).
autorotateToPortraitUpsideDown: Enable or disable autorotation to portrait orientation (button at the top).
autorotateToLandscapeLeft: Enable or disable autorotation to landscape left orientation (button on the right).
autorotateToLandscapeRight: Enable or disable autorotation to landscape right orientation (button on the left).

Visibility and Keyboard Size

There are three keyboard properties in iPhoneKeyboard that determine keyboard visibility status and size on the screen.

visible: Returns true if the keyboard is fully visible on the screen and can be used to enter characters.
area: Returns the position and dimensions of the keyboard.
active: Returns true if the keyboard is activated. This is not a static property; you must have a keyboard instance to use it.

Note that iPhoneKeyboard.area will return a rect with position and size set to 0 until the keyboard is fully visible on the screen. You should not query this value immediately after iPhoneKeyboard.Open. The sequence of keyboard events is as follows:

  • iPhoneKeyboard.Open is called. iPhoneKeyboard.active returns true. iPhoneKeyboard.visible returns false. iPhoneKeyboard.area returns (0, 0, 0, 0).
  • Keyboard slides out into the screen. All properties remain the same.
  • Keyboard stops sliding. iPhoneKeyboard.active returns true. iPhoneKeyboard.visible returns true. iPhoneKeyboard.area returns real position and size of the keyboard.
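
Given the sequence above, iPhoneKeyboard.area should be polled rather than read immediately after Open. A sketch (using the static visible and area properties described above):

```javascript
function Update () {
	// area is only meaningful once the keyboard has finished sliding in.
	if (iPhoneKeyboard.visible) {
		var kb : Rect = iPhoneKeyboard.area;
		Debug.Log("Keyboard occupies: " + kb);
	}
}
```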

Secure Text Input

It is possible to configure the keyboard to hide symbols when typing. This is useful when users are required to enter sensitive information (such as passwords). To manually open keyboard with secure text input enabled, use the following code:

iPhoneKeyboard.Open("", iPhoneKeyboardType.Default, false, false, true);

Hiding text while typing

Alert keyboard

To display the keyboard with a black semi-transparent background instead of the classic opaque, call iPhoneKeyboard.Open as follows:

iPhoneKeyboard.Open("", iPhoneKeyboardType.Default, false, false, true, true);

Classic keyboard

Alert keyboard

Android

Unity Android reuses the iOS API to display the system keyboard. Even though Unity Android supports most of the functionality available on iOS, there are two aspects which are not supported:

  • iPhoneKeyboard.hideInput
  • iPhoneKeyboard.area

Please also note that the layout of an iPhoneKeyboardType can differ somewhat between devices.

Page last updated: 2011-11-03



iOS-Advanced

iOS

Advanced iOS scripting

Determining Device Generation

Different device generations support different functionality and have widely varying performance. You should query the device's generation and decide which functionality should be disabled to compensate for slower devices.

You can find the device generation from the iPhone.generation property. The reported generation can be one of the following:

  • iPhone
  • iPhone3G
  • iPhone3GS
  • iPhone4
  • iPodTouch1Gen
  • iPodTouch2Gen
  • iPodTouch3Gen
  • iPodTouch4Gen
  • iPad1Gen

You can find more information about different device generations, performance and supported functionality in our iPhone Hardware Guide.
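
As an illustrative sketch (the quality level choice is only an example), you could disable expensive features on the pre-3GS, fixed-function hardware like this:

```javascript
function Start () {
	var gen = iPhone.generation;
	// MBX-class devices: no shaders, very slow CPU and GPU.
	if (gen == iPhoneGeneration.iPhone ||
	    gen == iPhoneGeneration.iPhone3G ||
	    gen == iPhoneGeneration.iPodTouch1Gen ||
	    gen == iPhoneGeneration.iPodTouch2Gen) {
		QualitySettings.currentLevel = QualityLevel.Fastest;
	}
}
```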

Device Properties

There are a number of device-specific properties that you can access:-

SystemInfo.deviceUniqueIdentifier: Unique device identifier.
SystemInfo.deviceName: User-specified name for the device.
SystemInfo.deviceModel: Is it an iPhone or an iPod Touch?
SystemInfo.operatingSystem: Operating system name and version.

Anti-Piracy Check

Pirates will often hack an application from the AppStore (by removing Apple DRM protection) and then redistribute it for free. Unity iOS comes with an anti-piracy check which allows you to determine if your application was altered after it was submitted to the AppStore.

You can check whether your application is genuine (not hacked) with the Application.genuine property. If this property returns false then you might notify the user that they are using a hacked application, or perhaps disable access to some functions of your application.

Note: accessing the Application.genuine property is a fairly expensive operation and so you shouldn't do it during frame updates or other time-critical code.
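
Given that cost, a reasonable pattern is a single check at startup; a sketch using these properties (Application.genuineCheckAvailable, covered in the Android section below, confirms the check can actually be performed):

```javascript
function Start () {
	// Do the (expensive) genuine check once, never in per-frame code.
	if (Application.genuineCheckAvailable && !Application.genuine) {
		Debug.Log("Application integrity check failed.");
		// For example, disable online features or show a notice here.
	}
}
```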

Vibration Support

You can trigger a vibration by calling Handheld.Vibrate. Note that iPod Touch devices lack vibration hardware and will just ignore this call.

Android

Advanced Android scripting

Determining Device Generation

Different Android devices support different functionality and have widely varying performance. You should target specific devices or device families and decide which functionality should be disabled to compensate for slower devices. There are a number of device-specific properties that you can access to determine which device is being used.

Note: the Android Marketplace performs some additional compatibility filtering, so you need not worry that an ARMv7-only app optimised for OGLES2 will be offered to old, slow devices.

Device Properties

SystemInfo.deviceUniqueIdentifier: Unique device identifier.
SystemInfo.deviceName: User-specified name for the device.
SystemInfo.deviceModel: The model of the device.
SystemInfo.operatingSystem: Operating system name and version.

Anti-Piracy Check

Pirates will often hack an application (by removing its copy protection) and then redistribute it for free. Unity Android comes with an anti-piracy check which allows you to determine if your application was altered after it was submitted to the store.

You can check whether your application is genuine (not hacked) with the Application.genuine property. If this property returns false then you might notify the user that they are using a hacked application, or perhaps disable access to some functions of your application.

Note: Application.genuineCheckAvailable should be used along with Application.genuine to verify that application integrity can actually be confirmed. Accessing the Application.genuine property is a fairly expensive operation and so you shouldn't do it during frame updates or other time-critical code.

Vibration Support

You can trigger a vibration by calling Handheld.Vibrate. However, devices lacking vibration hardware will just ignore this call.

Page last updated: 2012-07-12



iOS-DotNet

iOS

Unity iOS supports two .NET API compatibility levels: .NET 2.0 and a subset of .NET 2.0. You can select the appropriate level in the Player Settings.

.NET API 2.0

Unity supports the .NET 2.0 API profile. This is close to the full .NET 2.0 API and offers the best compatibility with pre-existing .NET code. However, the application's build size and startup time will be relatively poor.

Note: Unity iOS does not support namespaces in scripts. If you have a third party library supplied as source code then the best approach is to compile it to a DLL outside Unity and then drop the DLL file into your project's Assets folder.

.NET 2.0 Subset

Unity also supports the .NET 2.0 Subset API profile. This is close to the Mono "monotouch" profile, so many limitations of the "monotouch" profile also apply to Unity's .NET 2.0 Subset profile. More information on the limitations of the "monotouch" profile can be found here. The advantage of using this profile is reduced build size (and startup time) but this comes at the expense of compatibility with existing .NET code.

Android

Unity Android supports two .NET API compatibility levels: .NET 2.0 and a subset of .NET 2.0. You can select the appropriate level in the Player Settings.

.NET API 2.0

Unity supports the .NET 2.0 API profile. This is close to the full .NET 2.0 API and offers the best compatibility with pre-existing .NET code. However, the application's build size and startup time will be relatively poor.

Note: Unity Android does not support namespaces in scripts. If you have a third party library supplied as source code then the best approach is to compile it to a DLL outside Unity and then drop the DLL file into your project's Assets folder.

.NET 2.0 Subset

Unity also supports the .NET 2.0 Subset API profile. This is close to the Mono "monotouch" profile, so many limitations of the "monotouch" profile also apply to Unity's .NET 2.0 Subset profile. More information on the limitations of the "monotouch" profile can be found here. The advantage of using this profile is reduced build size (and startup time) but this comes at the expense of compatibility with existing .NET code.

Page last updated: 2012-07-12



iphone-Hardware

Hardware models

The following table summarizes iOS hardware available in devices of various generations:

iPhone Models

Original iPhone

  • Screen: 320x480 pixels, LCD at 163ppi
  • ARM11, 412 MHz CPU
  • PowerVR MBX Lite 3D graphics processor
    • Slow
  • 128MB of memory
  • 2 megapixel camera

Fixed-function graphics (no fancy shaders), very slow CPU and GPU.

iPhone 3G

  • Screen: 320x480 pixels, LCD at 163ppi
  • ARM11, 412 MHz CPU
  • PowerVR MBX Lite 3D graphics processor
    • Slow
  • 128MB of memory
  • 2 megapixel camera
  • GPS support

iPhone 3GS

  • Screen: 320x480 pixels, LCD at 163ppi
  • ARM Cortex A8, 600 MHz CPU
  • PowerVR SGX535 graphics processor
  • 256MB of memory
  • 3 megapixel camera with video capture capability

Shader-capable hardware; per-pixel lighting (bump maps) can only be used on small portions of the screen at once. Requires script optimization for complex games. This was the average hardware on the app market as of July 2012.

  • GPS support
  • Compass support

iPhone 4

  • Screen: 960x640 pixels, LCD at 326 ppi, 800:1 contrast ratio.
  • Apple A4
    • 1 GHz ARM Cortex-A8 CPU
    • PowerVR SGX535 GPU
  • 512MB of memory
  • Cameras
    • Rear 5.0 MP backside illuminated CMOS image sensor with 720p HD video at 30 fps and LED flash
    • Front 0.3 MP (VGA) with geotagging, tap to focus, and 480p SD video at 30 fps
  • GPS support
  • Compass Support

iPhone 4S

  • Screen: 960x640 pixels, LCD at 326 ppi, 800:1 contrast ratio.
  • Apple A5
    • Dual-Core 1 GHz ARM Cortex-A9 MPCore CPU
    • Dual-Core PowerVR SGX543MP2 GPU
  • 512MB of memory
  • Cameras
    • Rear 5.0 MP backside illuminated CMOS image sensor with 720p HD video at 30 fps and LED flash
    • Front 0.3 MP (VGA) with geotagging, tap to focus, and 480p SD video at 30 fps
  • GPS support
  • Compass support

The iPhone 4S, with the A5 chip, is capable of rendering complex shaders across the entire screen; even image effects may be possible. However, optimizing your shaders is still crucial. If your game isn't trying to push the limits of the device, though, optimizing scripting and gameplay is probably as unnecessary on this generation of devices as it is on PC.

iPod Touch Models

Fixed-function graphics (no fancy shaders), very slow CPU and GPU.

iPod Touch 1st generation

  • Screen: 320x480 pixels, LCD at 163ppi
  • ARM11, 412 MHz CPU
  • PowerVR MBX Lite 3D graphics processor
    • Slow
  • 128MB of memory

iPod Touch 2nd generation

  • Screen: 320x480 pixels, LCD at 163ppi
  • ARM11, 533 MHz CPU
  • PowerVR MBX Lite 3D graphics processor
    • Slow
  • 128MB of memory
  • Speaker and microphone

iPod Touch 3rd generation

  • Comparable to iPhone 3GS

Shader-capable hardware; per-pixel lighting (bump maps) can only be used on small portions of the screen at once. Requires script optimization for complex games. This was the average hardware on the app market as of July 2012.

iPod Touch 4th generation

  • Comparable to iPhone 4

iPad Models

Similar to iPod Touch 4th Generation and iPhone 4.

iPad

  • Screen: 1024x768 pixels, LCD at 132 ppi, LED-backlit.
  • Apple A4
    • 1 GHz ARM Cortex-A8 CPU
    • PowerVR SGX535 GPU
  • Wi-Fi + Bluetooth + (3G Cellular HSDPA, 2G cellular EDGE on the 3G version)
  • Accelerometer, ambient light sensor, magnetometer (for digital compass)
  • Mechanical keys: Home, sleep, screen rotation lock, volume.

iPad 2

  • Screen: 1024x768 pixels, LCD at 132 ppi, LED-backlit.
  • Apple A5
    • Dual-Core 1 GHz ARM Cortex-A9 MPCore CPU
    • Dual-Core PowerVR SGX543MP2 GPU
  • Otherwise the same as the previous model.

The A5 can do full-screen bump mapping, assuming the shader is simple enough. However, it is likely that your game will perform best with bump mapping only on crucial objects. Full-screen image effects are still out of reach. Scripting optimization is less important.

iPad 3

  • Screen: 2048x1536 pixels, LCD at 264 ppi, LED-backlit.
  • Apple A5X
    • Dual-Core 1 GHz ARM Cortex-A9 MPCore CPU
    • Quad-Core PowerVR SGX543MP4 GPU

The iPad 3 has been shown to be capable of render-to-texture effects such as reflective water and full-screen image effects. However, optimized shaders are still crucial. If your game isn't trying to push the limits of the device, though, optimizing scripting and gameplay is probably as unnecessary on this generation of devices as it is on PC.

Graphics Processing Unit and Hidden Surface Removal

The iPhone/iPad graphics processing unit (GPU) is a Tile-Based Deferred Renderer. In contrast with most GPUs in desktop computers, the iPhone/iPad GPU focuses on minimizing the work required to render an image as early as possible in the processing of a scene. That way, only the visible pixels will consume processing resources.

The GPU's frame buffer is divided up into tiles and rendering happens tile by tile. First, triangles for the whole frame are gathered and assigned to the tiles. Then, visible fragments of each triangle are chosen. Finally, the selected triangle fragments are passed to the rasterizer (triangle fragments occluded from the camera are rejected at this stage).

In other words, the iPhone/iPad GPU implements a Hidden Surface Removal operation at reduced cost. Such an architecture consumes less memory bandwidth, has lower power consumption and utilizes the texture cache better. Tile-Based Deferred Rendering allows the device to reject occluded fragments before actual rasterization, which helps to keep overdraw low.

For more information see also:-

MBX series

Older devices such as the original iPhone, iPhone 3G and iPod Touch 1st and 2nd Generation are equipped with the MBX series of GPUs. The MBX series supports only OpenGL ES1.1, the fixed function Transform/Lighting pipeline and two textures per fragment.

SGX series

Starting with the iPhone 3GS, newer devices are equipped with the SGX series of GPUs. The SGX series features support for the OpenGL ES2.0 rendering API and vertex and pixel shaders. The Fixed-function pipeline is not supported natively on such GPUs, but instead is emulated by generating vertex and pixel shaders with analogous functionality on the fly.

The SGX series fully supports MultiSample anti-aliasing.

Texture Compression

The only texture compression format supported by iOS is PVRTC. PVRTC provides support for RGB and RGBA (color information plus an alpha channel) texture formats and can compress a single pixel to two or four bits.

The PVRTC format is essential to reduce the memory footprint and to reduce consumption of memory bandwidth (ie, the rate at which data can be read from memory, which is usually very limited on mobile devices).

Vertex Processing Unit

The iPhone/iPad has a dedicated unit responsible for vertex processing which runs calculations in parallel with rasterization. In order to achieve better parallelization, the iPhone/iPad processes vertices one frame ahead of the rasterizer.

Unified Memory Architecture

Both the CPU and GPU on the iPhone/iPad share the same memory. The advantage is that you don't need to worry about running out of video memory for your textures (unless, of course, you run out of main memory too). The disadvantage is that you share the same memory bandwidth for gameplay and graphics. The more memory bandwidth you dedicate to graphics, the less you will have for gameplay and physics.

Multimedia CoProcessing Unit

The iPhone/iPad main CPU is equipped with a powerful SIMD (Single Instruction, Multiple Data) coprocessor supporting either the VFP or the NEON architecture. The Unity iOS run-time takes advantage of these units for multiple tasks such as calculating skinned mesh transformations, geometry batching, audio processing and other calculation-intensive operations.

Page last updated: 2012-08-20



iphone-performance

This page details optimizations that are specific to iOS devices. For more detailed information on mobile devices, see the Practical Guide to Optimization for Mobiles.

Page last updated: 2012-11-13



iphone-iOS-Optimization

This page details optimizations which are unique to iOS deployment. For more information on optimizing for mobile devices, see the Practical Guide to Optimization for Mobiles.

Script Call Optimization

Most of the functions in the UnityEngine namespace are implemented in C/C++. Calling a C/C++ function from a Mono script involves a performance overhead. You can use iOS Script Call Optimization (menu: Edit->Project Settings->Player) to save about 1 to 4 milliseconds per frame. The options for this setting are Slow and Safe (the default, with full exception handling) and Fast but no Exceptions (faster calls, but any exception thrown will crash the application).

Setting the Desired Framerate

Unity iOS allows you to change the frequency with which your application will try to execute its rendering loop, which is set to 30 frames per second by default. You can lower this number to save battery power but of course this saving will come at the expense of frame updates. Conversely, you can increase the framerate to give the rendering priority over other activities such as touch input and accelerometer processing. You will need to experiment with your choice of framerate to determine how it affects gameplay in your case.

If your application involves heavy computation or rendering and can maintain only 15 frames per second, say, then setting the desired frame rate higher than fifteen wouldn't give any extra performance. The application has to be optimized sufficiently to allow for a higher framerate.

To set the desired framerate, open the XCode project generated by Unity and open the AppController.mm file. The line

#define kFPS 30

...determines the current framerate, so you can simply change it to the desired value. For example, if you change the define to:-

#define kFPS 60

...then the application will attempt to render at 60 FPS instead of 30 FPS.

The Rendering Loop

When iOS version 3.1 or later is in use, Unity will use the CADisplayLink class to schedule the rendering loop. Earlier versions must use one of several fallback methods to handle the loop. However, the fallback methods can also be activated for iOS 3.1 and later by locating the line

#define USE_DISPLAY_LINK_IF_AVAILABLE 1

...and changing it to

#define USE_DISPLAY_LINK_IF_AVAILABLE 0

Fallback Loop Types

Apple recommends the system timer for scheduling the rendering operation on iOS versions before 3.1. This approach is good for applications where performance is not critical and favours battery life and correct processing of events over rendering performance. However, better rendering performance is often more important to games, so Unity provides several scheduling methods to tweak the performance of the rendering loop:-

The different fallback loop types can be selected by changing defines in the AppController.mm file. The significant lines are the following:-

#define FALLBACK_LOOP_TYPE NSTIMER_BASED_LOOP
#define FALLBACK_LOOP_TYPE THREAD_BASED_LOOP
#define FALLBACK_LOOP_TYPE EVENT_PUMP_BASED_LOOP

The file should have all but one of these lines commented out. The uncommented line selects the rendering loop method that will be used by the application.

If you want to prioritize rendering over input processing with the NSTimer approach you should locate and change the line

#define kThrottleFPS 2.0

...in AppController.mm. Increasing this number will give higher priority to rendering. The result of changing this value varies among applications, so it is best to try it for yourself and see what happens in your specific case.

If you use the Event Pump rendering loop then you need to tweak the kMillisecondsPerFrameToProcessEvents constant precisely to achieve the desired responsiveness. The kMillisecondsPerFrameToProcessEvents constant allows you to specify exactly how much time (in milliseconds) you will allow the OS to process events. If you allocate insufficient time for this task then touch or accelerometer events might be lost, and while the application will be fast, it will also be less responsive.

To specify the amount of time (in milliseconds) that the OS will spend processing events, locate and change the line

#define kMillisecondsPerFrameToProcessEvents 7.0

...in AppController.mm.

Tuning Accelerometer Processing Frequency

If accelerometer input is processed too frequently then the overall performance of your game may suffer as a result. By default, a Unity iOS application will sample the accelerometer 60 times per second. You may see some performance benefit by reducing the accelerometer sampling frequency and it can even be set to zero for games that don't use accelerometer input. You can change the accelerometer frequency from the Other Settings panel in the iOS Player Settings.

Page last updated: 2012-07-30



iphone-InternalProfiler

iOS

On iOS, the internal profiler is disabled by default. To enable it, open the Unity-generated XCode project, select the iPhone_Profiler.h file and change the line

#define ENABLE_INTERNAL_PROFILER 0

to

#define ENABLE_INTERNAL_PROFILER 1

Select Run->Console in the XCode menu to display the output console (GDB) and then run your project. Unity will output statistics to the console window every thirty frames.

Android

On Android, the internal profiler is enabled by default. Just make sure Development Build is checked in the player settings when building, and the statistics will show up in logcat when the game runs on the device. To view logcat, you need the Android Debug Bridge (adb). Once you have it, simply run the shell command adb logcat.

Here's an example of the built-in profiler's output.

iPhone/iPad Unity internal profiler stats:
cpu-player>    min:  9.8   max: 24.0   avg: 16.3
cpu-ogles-drv> min:  1.8   max:  8.2   avg:  4.3
cpu-waits-gpu> min:  0.8   max:  1.2   avg:  0.9
cpu-present>   min:  1.2   max:  3.9   avg:  1.6
frametime>     min: 31.9   max: 37.8   avg: 34.1
draw-call #>   min:   4    max:   9    avg:   6     | batched:    10
tris #>        min:  3590  max:  4561  avg:  3871   | batched:  3572
verts #>       min:  1940  max:  2487  avg:  2104   | batched:  1900
player-detail> physx:  1.2 animation:  1.2 culling:  0.5 skinning:  0.0 batching:  0.2 render: 12.0 fixed-update-count: 1 .. 2
mono-scripts>  update:  0.5   fixedUpdate:  0.0 coroutines:  0.0 
mono-memory>   used heap: 233472 allocated heap: 548864  max number of collections: 1 collection total duration:  5.7

All times are measured in milliseconds per frame. You can see the minimum, maximum and average times over the last thirty frames.

General CPU Activity

cpu-player: Displays the time your game spends executing code inside the Unity engine and executing scripts on the CPU.
cpu-ogles-drv: Displays the time spent executing OpenGL ES driver code on the CPU. Many factors, such as the number of draw calls, the number of internal rendering state changes, the rendering pipeline setup and even the number of processed vertices, can affect the driver stats.
cpu-waits-gpu: Displays the time the CPU is idle while waiting for the GPU to finish rendering. If this number exceeds 2-3 milliseconds then your application is most probably fillrate/GPU processing bound. If this value is too small then the profiler skips displaying it.
msaa-resolve: The time taken to apply anti-aliasing.
cpu-present: The amount of time spent executing the presentRenderbuffer command in OpenGL ES.
frametime: Represents the overall time of a game frame. Note that iOS hardware is always locked at a 60Hz refresh rate, so frame times will always be a multiple of ~16.7ms (1000ms/60Hz = ~16.7ms).

Rendering Statistics

draw-call #: The number of draw calls per frame. Keep it as low as possible.
tris #: Total number of triangles sent for rendering.
verts #: Total number of vertices sent for rendering. You should keep this number below 10000 if you use only static geometry, but if you have lots of skinned geometry then you should keep it much lower.
batched: Number of draw calls, triangles and vertices that were automatically batched by the engine. Comparing these numbers with the draw-call and triangle totals will give you an idea of how well your scene is prepared for batching. Share as many materials as possible among your objects to improve batching.

Detailed Unity Player Statistics

The player-detail section provides a detailed breakdown of what is happening inside the engine:-

physx: Time spent on physics.
animation: Time spent animating bones.
culling: Time spent culling objects outside the camera frustum.
skinning: Time spent applying animations to skinned meshes.
batching: Time spent batching geometry. Batching dynamic geometry is considerably more expensive than batching static geometry.
render: Time spent rendering visible objects.
fixed-update-count: Minimum and maximum number of FixedUpdates executed during this frame. Too many FixedUpdates will deteriorate performance considerably. There are some simple guidelines for setting a good value for the fixed time delta here.

Detailed Scripts Statistics

The mono-scripts section provides a detailed breakdown of the time spent executing code in the Mono runtime:

update: Total time spent executing all Update() functions in scripts.
fixedUpdate: Total time spent executing all FixedUpdate() functions in scripts.
coroutines: Time spent inside script coroutines.

Detailed Statistics on Memory Allocated by Scripts

The mono-memory section gives you an idea of how memory is being managed by the Mono garbage collector:

allocated heap: Total amount of memory available for allocations. A garbage collection will be triggered if there is not enough memory left in the heap for a given allocation. If there is still not enough free memory even after the collection then the allocated heap will grow in size.
used heap: The portion of the allocated heap which is currently used up by objects. Every time you create a new class instance (not a struct) this number will grow until the next garbage collection.
max number of collections: Number of garbage collection passes during the last 30 frames.
collection total duration: Total time (in milliseconds) of all garbage collection passes that have happened during the last 30 frames.

Page last updated: 2012-07-29



iphone-playerSizeOptimization

The two main ways of reducing the size of the player are by changing the Active Build Configuration within Xcode and by changing the Stripping Level within Unity.

Building in Release Mode

You can choose between the Debug and Release options on the Active Build Configuration drop-down menu in Xcode. Building as Release instead of Debug can reduce the size of the built player by as much as 2-3MB, depending on the game.


The Active Build Configuration drop-down

In Release mode, the player will be built without any debug information, so if your game crashes or has other problems there will be no stack trace information available for output. This is fine for deploying a finished game but you will probably want to use Debug mode during development.

iOS Stripping Level (Advanced License feature)

The size optimizations activated by stripping work in the following way:-

  1. Strip assemblies level: the scripts' bytecode is analyzed so that classes and methods that are not referenced from the scripts can be removed from the DLLs and thereby excluded from the AOT compilation phase. This optimization reduces the size of the main binary and accompanying DLLs and is safe as long as no reflection is used.
  2. Strip ByteCode level: any .NET DLLs (stored in the Data folder) are stripped down to metadata only. This is possible because all the code is already precompiled during the AOT phase and linked into the main binary.
  3. Use micro mscorlib level: a special, smaller version of mscorlib is used. Some components are removed from this library, for example, Security, Reflection.Emit, Remoting, non-Gregorian calendars, etc. Also, interdependencies between internal components are minimized. This optimization reduces the main binary and mscorlib.dll size but it is not compatible with some System and System.Xml assembly classes, so use it with care.

These levels are cumulative, so level 3 optimization implicitly includes levels 2 and 1, while level 2 optimization includes level 1.

Note: Micro mscorlib is a heavily stripped-down version of the core library. Only those items that are required by the Mono runtime in Unity remain. Best practice for using micro mscorlib is not to use any classes or other features of .NET that are not required by your application. GUIDs are a good example of something you could omit; they can easily be replaced with custom made pseudo GUIDs and doing this would result in better performance and app size.
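
For example, a custom pseudo-GUID along the lines suggested above could be as simple as the following sketch (MakePseudoGuid is a hypothetical helper, not part of any API):

```javascript
// Hypothetical replacement for System.Guid, which is unavailable under
// micro mscorlib: 32 random hexadecimal characters.
function MakePseudoGuid () : String {
	var hex = "0123456789abcdef";
	var s = "";
	for (var i = 0; i < 32; i++) {
		s += hex[Random.Range(0, 16)];
	}
	return s;
}
```

Note that such values are not globally unique in the strict sense; if stronger uniqueness is needed, combine the result with something like SystemInfo.deviceUniqueIdentifier.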

Tips

How to Deal with Stripping when Using Reflection

Stripping depends highly on static code analysis and sometimes this can't be done effectively, especially when dynamic features like reflection are used. In such cases, it is necessary to give some hints as to which classes shouldn't be touched. Unity supports a per-project custom stripping blacklist. Using the blacklist is a simple matter of creating a link.xml file and placing it into the Assets folder. An example of the contents of the link.xml file follows. Classes marked for preservation will not be affected by stripping:-

<linker>
       <assembly fullname="System.Web.Services">
               <type fullname="System.Web.Services.Protocols.SoapTypeStubInfo" preserve="all"/>
               <type fullname="System.Web.Services.Configuration.WebServicesConfigurationSectionHandler" preserve="all"/>
       </assembly>

       <assembly fullname="System">
               <type fullname="System.Net.Configuration.WebRequestModuleHandler" preserve="all"/>
               <type fullname="System.Net.HttpRequestCreator" preserve="all"/>
               <type fullname="System.Net.FileWebRequestCreator" preserve="all"/>
       </assembly>
</linker>

Note: it can sometimes be difficult to determine which classes are getting stripped in error even though the application requires them. You can often get useful information about this by running the stripped application on the simulator and checking the Xcode console for error messages.

Simple Checklist for Making Your Distribution as Small as Possible

  1. Minimize your assets: enable PVRTC compression for textures and reduce their resolution as far as possible. Also, minimize the number of uncompressed sounds. There are some additional tips for file size reduction here.
  2. Set the iOS Stripping Level to Use micro mscorlib.
  3. Set the script call optimization level to Fast but no exceptions.
  4. Don't use anything that lives in System.dll or System.Xml.dll in your code. These libraries are not compatible with micro mscorlib.
  5. Remove unnecessary code dependencies.
  6. Set the API Compatibility Level to .Net 2.0 subset. Note that .Net 2.0 subset has limited compatibility with other libraries.
  7. Set the Target Platform to armv6 (OpenGL ES1.1).
  8. Don't use JS Arrays.
  9. Avoid generic containers in combination with value types, including structs.

Can I produce apps of less than 20 megabytes with Unity?

Yes. An empty project would take about 13 MB in the AppStore if all the size optimizations were turned off. This gives you a budget of about 7MB for compressed assets in your game. If you own an Advanced License (and therefore have access to the stripping option), the empty scene with just the main camera can be reduced to about 6 MB in the AppStore (zipped and DRM attached) and you will have about 14 MB available for compressed assets.

Why did my app increase in size after being released to the AppStore?

When your app is published, Apple first encrypts the binary file and then compresses it via zip. Most often Apple's DRM increases the binary size by about 4 MB or so. As a general rule, you should expect the final size to be approximately equal to the size of the zip-compressed archive of all files (except the executable) plus the size of the uncompressed executable file.

Page last updated: 2011-11-08



iphone-accountsetup

There are some steps you must follow before you can build and run any code (including Unity-built games) on your iOS device. These steps are prerequisites for publishing your own iOS games.

1. Apply to Apple to Become a Registered iPhone/iPad Developer

You do this through Apple's website: http://developer.apple.com/iphone/program/

2. Upgrade your Operating System and iTunes Installation

Please note that these are Apple's requirements as part of using the iPhone SDK, but the requirements can change from time to time.

3. Download the iPhone SDK

Download the latest iOS SDK from the iOS dev center and install it. Do not download the beta version of the SDK - you should use only the latest shipping version. Note that downloading and installing the iPhone SDK will also install Xcode.

4. Get Your Device Identifier

Connect your iOS device to the Mac with the USB cable and launch Xcode. Xcode will detect your phone as a new device and you should register it with the "Use For Development" button. This will usually open the Organizer window but if it doesn't then go to Window->Organizer. You should see your iOS device in the devices list on the left; select it and note your device's identifier code (which is about 40 characters long).

5. Add Your Device

Log in to the iPhone developer center and enter the program portal (button on the right). Go to the Devices page via the link on the left side and then click the Add Device button on the right. Enter a name for your device (alphanumeric characters only) and your device's identifier code (noted in step 4 above). Click the Submit button when done.

6. Create a Certificate

From the iPhone Developer Program Portal, click the Certificates link on the left side and follow the instructions listed under How-To...

7. Download and Install the WWDR Intermediate Certificate

The download link, labeled WWDR Intermediate Certificate, is in the same "Certificates" section (just above the "Important Notice" rubric). Once downloaded, double-click the certificate file to install it.

8. Create a Provisioning File

Provisioning profiles are a bit complex, and need to be set up according to the way you have organized your team. It is difficult to give general instructions for provisioning, so we recommend that you look at the Provisioning How-to section on the Apple Developer website.

Page last updated: 2011-11-08



iphone-unsupported

Graphics

Audio

Scripting

Features Restricted to the Unity iOS Advanced License

Note: 1MB of .NET CIL code translates to roughly 3-4MB of ARM code, so we recommend keeping references to external libraries to a minimum. For example, if your application references System.dll and System.Xml.dll and stripping is not used, an extra 6MB of ARM code will be needed. At some point the application will reach a limit where the linker has problems linking the code. If application size is a concern, C# may be a better choice than JavaScript for your code because it has fewer dependencies.

Page last updated: 2012-11-13



iphone-Plugins

This page describes Native Code Plugins for the iOS platform.

Building an Application with a Native Plugin for iOS

  1. Define your extern method in the C# file as follows:
    [DllImport ("__Internal")]
    private static extern float FooPluginFunction ();
  2. Set the editor to the iOS build target
  3. Add your native code source files to the generated Xcode project's "Classes" folder (this folder is not overwritten when the project is updated, but don't forget to back up your native code).

If you are using C++ (.cpp) or Objective-C (.mm) to implement the plugin you must ensure the functions are declared with C linkage to avoid name mangling issues.

extern "C" {
  float FooPluginFunction ();
} 

Using Your Plugin from C#

iOS native plugins can be called only when deployed on the actual device, so it is recommended to wrap all native code methods with an additional C# code layer. This code should check Application.platform and call native methods only when the app is running on the device; dummy values can be returned when the app runs in the Editor. See the Bonjour browser sample application for an example.
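Building on the FooPluginFunction declaration from the steps above, such a wrapper might look like this sketch (the dummy return value is an assumption; choose whatever makes sense for your plugin):

```csharp
using System.Runtime.InteropServices;
using UnityEngine;

public class FooPlugin
{
    [DllImport("__Internal")]
    private static extern float FooPluginFunction();

    // Wrapper: call the native implementation only when running on the
    // device, and return a dummy value everywhere else (e.g. in the Editor).
    public static float Foo()
    {
        if (Application.platform == RuntimePlatform.IPhonePlayer)
            return FooPluginFunction();
        return 0.0f; // dummy value for the Editor
    }
}
```

Game code then always calls FooPlugin.Foo() and never touches the extern declaration directly.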

Calling C# / JavaScript back from native code

Unity iOS supports limited native-to-managed callback functionality via UnitySendMessage:
UnitySendMessage("GameObjectName1", "MethodName1", "Message to send");

This function has three parameters: the name of the target GameObject, the script method to call on that object, and the message string to pass to the called method.

Known limitations:

  1. Only script methods that correspond to the following signature can be called from native code: function MethodName(message:string)
  2. Calls to UnitySendMessage are asynchronous and have a delay of one frame.
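For instance, the UnitySendMessage call shown above would reach a script like the following, attached to a GameObject named "GameObjectName1" (CallbackReceiver is a hypothetical class name; in C# the method must still take a single string parameter):

```csharp
using UnityEngine;

public class CallbackReceiver : MonoBehaviour
{
    // Matches the required signature: one string parameter, no return value.
    void MethodName1(string message)
    {
        Debug.Log("Received from native code: " + message);
    }
}
```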

Automated plugin integration

Unity iOS supports automated plugin integration in a limited way. All files with extensions .a,.m,.mm,.c,.cpp located in the Assets/Plugins/iOS folder will be merged into the generated Xcode project automatically. However, merging is done by symlinking files from Assets/Plugins/iOS to the final destination, which might affect some workflows. The .h files are not included in the Xcode project tree, but they appear on the destination file system, thus allowing compilation of .m/.mm/.c/.cpp files.

Note: subfolders are currently not supported.
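A hypothetical plugin layout following these rules might look like:

```
Assets/
  Plugins/
    iOS/
      MyPlugin.h    (visible to the compiler, but not added to the Xcode project tree)
      MyPlugin.mm   (merged into the generated Xcode project automatically)
```

The file names here are illustrative; any .a, .m, .mm, .c or .cpp file placed directly in Assets/Plugins/iOS is picked up.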

iOS Tips

  1. Managed-to-unmanaged calls are quite processor intensive on iOS. Try to avoid calling multiple native methods per frame.
  2. As mentioned above, wrap your native methods with an additional C# layer that calls native code on the device and returns dummy values in the Editor.
  3. String values returned from a native method should be UTF-8 encoded and allocated on the heap. Mono marshaling calls are free for strings like this.
  4. As mentioned above, the Xcode project's "Classes" folder is a good place to store your native code because it is not overwritten when the project is updated.
  5. Another good place for storing native code is the Assets folder or one of its subfolders. Just add references from the Xcode project to the native code files: right click on the "Classes" subfolder and choose "Add->Existing files...".

Examples

Bonjour Browser Sample

A simple example of the use of a native code plugin can be found here

This sample demonstrates how Objective-C code can be invoked from a Unity iOS application. This application implements a very simple Bonjour client. The application consists of a Unity iOS project (Plugins/Bonjour.cs is the C# interface to the native code, while BonjourTest.js is the JS script that implements the application logic) and native code (Assets/Code) that should be added to the built Xcode project.

Page last updated: 2011-11-01



iphone-Downloadable-Content

This chapter does not aim to cover how to integrate your game with Apple's "StoreKit" API. It is assumed that you already have integration with "StoreKit" via a native code plugin.

Apple's "StoreKit" documentation defines four kinds of Products that could be sold via the "In App Purchase" process:

This chapter covers the first case only and focuses mainly on the downloadable content concept. AssetBundles are ideal candidates for use as downloadable content, and two scenarios will be covered:

Exporting your assets for use on iOS

Having separate projects for downloadable content can be a good idea, allowing better separation between content that comes with your main application and content that is downloaded later.

Please note: Any game scripts included in downloadable content must also be present in the main executable.

  1. Create an Editor folder inside the Project View.
  2. Create an ExportBundle.js script there and place the following code inside:
    @MenuItem ("Assets/Build AssetBundle From Selection - Track dependencies")
    static function ExportBundle(){
    
            var str : String = EditorUtility.SaveFilePanel("Save Bundle...", Application.dataPath, Selection.activeObject.name, "assetbundle");
            if (str.Length != 0){
                 BuildPipeline.BuildAssetBundle(Selection.activeObject, Selection.objects, str, BuildAssetBundleOptions.CompleteAssets, BuildTarget.iPhone);
            }
    }
    
  3. Design your objects that need to be downloadable as prefabs
  4. Select a prefab that needs to be exported and right-click it

    If the first two steps were done properly, then the Build AssetBundle From Selection - Track dependencies context menu item should be visible.
  5. Select it if you want to include everything that this asset uses.
  6. A save dialog will be shown, enter the desired asset bundle file name. An .assetbundle extension will be added automatically. The Unity iOS runtime accepts only asset bundles built with the same version of the Unity editor as the final application. Read BuildPipeline.BuildAssetBundle for details.

Downloading your assets on iOS

  1. Asset bundles can be downloaded and loaded by using the WWW class and instantiating a main asset. Code sample:
    	var download : WWW;
    
    	var url = "http://somehost/somepath/someassetbundle.assetbundle";
    
    	download = new WWW (url);
    
    	yield download;
    
    	assetBundle = download.assetBundle;
    
    	if (assetBundle != null) {
    		// Alternatively you can also load an asset by name (assetBundle.Load("my asset name"))
    		var go : Object = assetBundle.mainAsset;
    
    		if (go != null)
    			instanced = Instantiate(go);
    		else
    			Debug.Log("Couldn't load resource");
    	} else {
    		Debug.Log("Couldn't load resource");
    	}
    
  2. You can save required files to a Documents folder next to your game's Data folder.
            public static string GetiPhoneDocumentsPath () { 
                    // Your game has read+write access to /var/mobile/Applications/XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX/Documents 
                    // Application.dataPath returns              
                    // /var/mobile/Applications/XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX/myappname.app/Data 
                    // Strip "/Data" from path 
                    string path = Application.dataPath.Substring (0, Application.dataPath.Length - 5); 
                    // Strip application name 
                    path = path.Substring(0, path.LastIndexOf('/'));  
                    return path + "/Documents"; 
            }
    
  3. Cache a downloaded asset bundle using the .NET file API and reuse it in the future by loading it via the WWW class and file:///pathtoyourapplication/Documents/savedassetbundle.assetbundle. Sample code for caching:
    	// Code designed for caching on iPhone, cachedAssetBundle path must be different when running in Editor
    	// See code snippet above for getting the path to your Documents folder
    	private var cachedAssetBundle : String = "path to your Documents folder" + "/savedassetbundle.assetbundle"; 
    	var cache = new System.IO.FileStream(cachedAssetBundle, System.IO.FileMode.Create);
    	cache.Write(download.bytes, 0, download.bytes.Length);
    	cache.Close();
    	Debug.Log("Cache saved: " + cachedAssetBundle);
    
Note: You can test reading files from the Documents folder if you enable file sharing. Setting UIFileSharingEnabled to true in your Info.plist allows you to access the Documents folder from iTunes.
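The Info.plist change mentioned above is a single boolean key; the fragment might look like this (showing only the relevant key, not a complete Info.plist):

```xml
<key>UIFileSharingEnabled</key>
<true/>
```

With this key set, the application's Documents folder becomes visible under the Apps tab for the device in iTunes.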

Page last updated: 2011-11-16



MobileCustomizeSplashScreen

iOS

Under iOS Basic, a default splash screen will be displayed while your game loads, oriented according to the Default Screen Orientation option in the Player Settings.

Users with an iOS Pro license can use any texture in the project as a splash screen. The size of the texture depends on the target device (320x480 pixels for 1st-3rd gen devices, 1024x768 for iPad, 640x960 for 4th gen devices) and supplied textures will be scaled to fit if necessary. You can set the splash screen textures using the iOS Player Settings.

Android

Under Android Basic, a default splash screen will be displayed while your game loads, oriented according to the Default Screen Orientation option in the Player Settings.

Android Pro users can use any texture in the project as a splash screen. You can set the texture from the Splash Image section of the Android Player Settings. You should also select the Splash scaling method from the following options:-

  • Center (only scale down) will draw your image at its natural size unless it is too large, in which case it will be scaled down to fit.
  • Scale to fit (letter-boxed) will draw your image so that the longer dimension fits the screen size exactly. Empty space around the sides in the shorter dimension will be filled in black.
  • Scale to fill (cropped) will scale your image so that the shorter dimension fits the screen size exactly. The image will be cropped in the longer dimension.

Page last updated: 2011-11-08



iphone-troubleshooting

This section addresses common problems that can arise when using Unity. Each platform is dealt with separately below.

Desktop

In MonoDevelop, the Debug button is greyed out!

  • This means that MonoDevelop was unable to find the Unity executable. In the MonoDevelop preferences, go to the Unity/Debugger section and then browse to where your Unity executable is located.

Is there a way to get rid of the welcome page in MonoDevelop?

  • Yes. In the MonoDevelop preferences, go to the Visual Style section, and uncheck "Load welcome page on startup".

GeForce 7300GT on OS X 10.6.4

  • Deferred rendering is disabled because materials are not displayed correctly on the GeForce 7300GT under OS X 10.6.4; this happens because of buggy video drivers.

On Windows x64, Unity crashes when my script throws a NullReferenceException

Graphics

Slow framerate and/or visual artifacts.

  • This may occur if your video card drivers are not up to date. Make sure you have the latest official drivers from your card vendor.

Shadows

I see no shadows at all!

  • Shadows are a Unity Pro only feature, so without Unity Pro you won't get shadows. Simpler shadow methods, like using a Projector, are still possible, of course.
  • Shadows also require certain graphics hardware support. See Shadows page for details.
  • Check that shadows are not completely disabled in the Quality Settings.
  • Shadows are currently not supported for Android and iOS mobile platforms.

Some of my objects do not cast or receive shadows

An object's Renderer must have Receive Shadows enabled for shadows to be rendered onto it. Also, an object must have Cast Shadows enabled in order to cast shadows on other objects (both are on by default).

Only opaque objects cast and receive shadows. This means that objects using the built-in Transparent or Particle shaders will not cast shadows. In most cases it is possible to use Transparent Cutout shaders for objects like fences, vegetation, etc. If you use custom written Shaders, they have to be pixel-lit and use the Geometry render queue. Objects using VertexLit shaders do not receive shadows but are able to cast them.

Only Pixel lights cast shadows. If you want to make sure that a light always casts shadows no matter how many other lights are in the scene, then you can set it to Force Pixel render mode (see the Light reference page).

iOS

Troubleshooting on iOS devices

There are some situations with iOS where your game can work perfectly in the Unity editor but then doesn't work or maybe doesn't even start on the actual device. The problems are often related to code or content quality. This section describes the most common scenarios.

The game stops responding after a while. Xcode shows "interrupted" in the status bar.

There are a number of reasons why this may happen. Typical causes include:

  1. Scripting errors such as using uninitialized variables, etc.
  2. Using 3rd party Thumb compiled native libraries. Such libraries trigger a known problem in the iOS SDK linker and might cause random crashes.
  3. Using generic types with value types as parameters (eg, List<int>, List<SomeStruct>, List<SomeEnum>, etc) for serializable script properties.
  4. Using reflection when managed code stripping is enabled.
  5. Errors in the native plugin interface (the managed code method signature does not match the native code function signature).

Information from the Xcode Debugger console can often help detect these problems (Xcode menu: View > Debug Area > Activate Console).

The Xcode console shows a "Program received signal: SIGBUS" or EXC_BAD_ACCESS error.

This message typically appears on iOS devices when your application receives a NullReferenceException. There are two ways to figure out where the fault happened:

Managed stack traces

Since version 3.4 Unity includes software-based handling of the NullReferenceException. The AOT compiler includes quick checks for null references each time a method or variable is accessed on an object. This feature affects script performance which is why it is enabled only for development builds (for basic license users it is enough to enable the "development build" option in the Build Settings dialog, while iOS pro license users additionally need to enable the "script debugging" option). If everything was done right and the fault actually is occurring in .NET code then you won't see EXC_BAD_ACCESS anymore. Instead, the .NET exception text will be printed in the Xcode console (or else your code will just handle it in a "catch" statement). Typical output might be:

Unhandled Exception: System.NullReferenceException: A null value was found where an object instance was required.
  at DayController+$handleTimeOfDay$121+$.MoveNext () [0x0035a] in DayController.js:122 

This indicates that the fault happened in the handleTimeOfDay method of the DayController class, which works as a coroutine. Also if it is script code then you will generally be told the exact line number (eg, "DayController.js:122"). The offending line might be something like the following:

 Instantiate(_imgwww.assetBundle.mainAsset);

This might happen if, say, the script accesses an asset bundle without first checking that it was downloaded correctly.

Native stack traces

Native stack traces are a much more powerful tool for fault investigation but using them requires some expertise. Also, you generally can't continue after these native (hardware memory access) faults happen. To get a native stack trace, type bt all into the Xcode Debugger Console. Carefully inspect the printed stack traces - they may contain hints about where the error occurred. You might see something like:

...
Thread 1 (thread 11523): 
#0 0x006267d0 in m_OptionsMenu_Start () 
#1 0x002e4160 in wrapper_runtime_invoke_object_runtime_invoke_void__this___object_intptr_intptr_intptr () 
#2 0x00a1dd64 in mono_jit_runtime_invoke (method=0x18b63bc, obj=0x5d10cb0, params=0x0, exc=0x2fffdd34) at /Users/mantasp/work/unity/unity-mono/External/Mono/mono/mono/mini/mini.c:4487
#3 0x0088481c in MonoBehaviour::InvokeMethodOrCoroutineChecked ()
...

First of all you should find the stack trace for "Thread 1", which is the main thread. The very first lines of the stack trace will point to the place where the error occurred. In this example, the trace indicates that the NullReferenceException happened inside the "OptionsMenu" script's "Start" method. Looking carefully at this method implementation would reveal the cause of the problem. Typically, NullReferenceExceptions happen inside the Start method when incorrect assumptions are made about initialization order. In some cases only a partial stack trace is seen on the Debugger Console:

Thread 1 (thread 11523): 
#0 0x0062564c in start ()

This indicates that native symbols were stripped during the Release build of the application. The full stack trace can be obtained with the following procedure:

  • Remove application from device.
  • Clean all targets.
  • Build and run.
  • Get stack traces again as described above.

EXC_BAD_ACCESS starts occurring when an external library is linked to the Unity iOS application.

This usually happens when an external library is compiled with the ARM Thumb instruction set. Currently such libraries are not compatible with Unity. The problem can be solved easily by recompiling the library without Thumb instructions. You can do this for the library's Xcode project with the following steps:

  • in Xcode, select "View" > "Navigators" > "Show Project Navigator" from the menu
  • select the "Unity-iPhone" project, activate "Build Settings" tab
  • in the search field enter : "Other C Flags"
  • add -mno-thumb flag there and rebuild the library.

If the library source is not available you should ask the supplier for a non-thumb version of the library.

The Xcode console shows "WARNING -> applicationDidReceiveMemoryWarning()" and the application crashes immediately afterwards

(Sometimes you might see a message like Program received signal: 0.) This warning message is often not fatal and merely indicates that iOS is low on memory and is asking applications to free up some memory. Typically, background processes like Mail will free some memory and your application can continue to run. However, if your application continues to use memory or ask for more, the OS will eventually start killing applications and yours could be one of them. Apple does not document what memory usage is safe, but empirical observations show that applications using less than 50% of the device's RAM (e.g. ~200-256 MB for a 2nd generation iPad) do not have major memory usage problems. The main metric you should rely on is how much RAM your application uses. Your application's memory usage consists of four major components:

  • application code (the OS needs to load and keep your application code in RAM, but some of it might be discarded if really needed)
  • native heap (used by the engine to store its state, your assets, etc. in RAM)
  • managed heap (used by your Mono runtime to keep C# or JavaScript objects)
  • GLES driver memory pools: textures, framebuffers, compiled shaders, etc.

Your application's memory usage can be tracked by several Xcode Instruments tools: Activity Monitor, Object Allocations and VM Tracker. You can start them from the Xcode Run menu: Product > Profile, then select the specific tool. The Activity Monitor tool shows all process statistics, including Real memory, which can be regarded as the total amount of RAM used by your application. Note: the combination of OS and device hardware version might noticeably affect memory usage numbers, so you should be careful when comparing numbers obtained on different devices.

Note: The internal profiler shows only the heap allocated by .NET scripts. Total memory usage can be determined via Xcode Instruments as shown above. This figure includes parts of the application binary, some standard framework buffers, Unity engine internal state buffers, the .NET runtime heap (number printed by internal profiler), GLES driver heap and some other miscellaneous stuff.

The Object Allocations tool displays all allocations made by your application and includes both native heap and managed heap statistics (don't forget to check the Created and still living box to get the current state of the application). The important statistic is the Net bytes value.

To keep memory usage low:

  • Reduce the application binary size by using the strongest iOS stripping options (Advanced license feature), and avoid unnecessary dependencies on different .NET libraries. See the player settings and player size optimization manual pages for further details.
  • Reduce the size of your content. Use PVRTC compression for textures and use low poly models. See the manual page about reducing file size for more information.
  • Don't allocate more memory than necessary in your scripts. Track mono heap size and usage with the internal profiler
  • Note: with Unity 3.0, the scene loading implementation has changed significantly and now all scene assets are preloaded. This results in fewer hiccups when instantiating game objects. If you need more fine-grained control of asset loading and unloading during gameplay, you should use Resources.Load and Object.Destroy.
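The fine-grained approach mentioned in the last point might be sketched as follows (the asset path "Enemies/Goblin" and the class name are hypothetical; the asset must live under a Resources folder):

```csharp
using UnityEngine;

public class OnDemandSpawner : MonoBehaviour
{
    GameObject instance;

    void SpawnEnemy()
    {
        // Load only when needed instead of referencing the asset in the scene.
        var prefab = Resources.Load("Enemies/Goblin") as GameObject;
        instance = Instantiate(prefab) as GameObject;
    }

    void DespawnEnemy()
    {
        Destroy(instance);
        // Free the memory used by assets that are no longer referenced.
        Resources.UnloadUnusedAssets();
    }
}
```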

Querying the OS about the amount of free memory may seem like a good idea to evaluate how well your application is performing. However, the free memory statistic is likely to be unreliable since the OS uses a lot of dynamic buffers and caches. The only reliable approach is to keep track of memory consumption for your application and use that as the main metric. Pay attention to how the graphs from the tools described above change over time, especially after loading new levels.

The game runs correctly when launched from Xcode but crashes while loading the first level when launched manually on the device.

There could be several reasons for this. You need to inspect the device logs to get more details. Connect the device to your Mac, launch Xcode and select Window > Organizer from the menu. Select your device in the Organizer's left toolbar, then click on the "Console" tab and review the latest messages carefully. Additionally, you may need to investigate crash reports. You can find out how to obtain crash reports here: http://developer.apple.com/iphone/library/technotes/tn2008/tn2151.html.

The Xcode Organizer console contains the message "killed by SpringBoard".

There is a poorly-documented time limit for an iOS application to render its first frames and process input. If your application exceeds this limit, it will be killed by SpringBoard. This may happen in an application with a first scene which is too large, for example. To avoid this problem, it is advisable to create a small initial scene which just displays a splash screen, waits a frame or two with yield and then starts loading the real scene. This can be done with code as simple as the following:

 
function Start () {
    yield;
    Application.LoadLevel("Test");
}

Type.GetProperty() / Type.GetValue() cause crashes on the device

Currently Type.GetProperty() and Type.GetValue() are supported only for the .NET 2.0 Subset profile. You can select the .NET API compatibility level in the Player Settings.

Note: Type.GetProperty() and Type.GetValue() might be incompatible with managed code stripping and might need to be excluded (you can supply a custom non-strippable type list during the stripping process to accomplish this). For further details, see the iOS player size optimization guide.

The game crashes with the error message "ExecutionEngineException: Attempting to JIT compile method 'SomeType`1<SomeValueType>:.ctor ()' while running with --aot-only."

The Mono .NET implementation for iOS is based on AOT (ahead of time compilation to native code) technology, which has its limitations. It compiles only those generic type methods (where a value type is used as a generic parameter) which are explicitly used by other code. When such methods are used only via reflection or from native code (ie, the serialization system) then they get skipped during AOT compilation. The AOT compiler can be hinted to include code by adding a dummy method somewhere in the script code. This can refer to the missing methods and so get them compiled ahead of time.

void _unusedMethod()
{
    var tmp = new SomeType<SomeValueType>();
}

Note: value types are basic types, enums and structs.

Various crashes occur on the device when a combination of System.Security.Cryptography and managed code stripping is used

.NET Cryptography services rely heavily on reflection and so are not compatible with managed code stripping, since stripping involves static code analysis. Sometimes the easiest solution to the crashes is to exclude the whole System.Security.Cryptography namespace from the stripping process.

The stripping process can be customized by adding a custom link.xml file to the Assets folder of your Unity project. This specifies which types and namespaces should be excluded from stripping. Further details can be found in the iOS player size optimization guide.

link.xml

<linker>
       <assembly fullname="mscorlib">
               <namespace fullname="System.Security.Cryptography" preserve="all"/>
       </assembly>
</linker>

Application crashes when using System.Security.Cryptography.MD5 with managed code stripping

You might consider the advice listed above, or you can work around this problem by adding an extra reference to the specific class in your script code:

object obj = new MD5CryptoServiceProvider();

"Ran out of trampolines of type 1/2" runtime error

This error usually happens if you use lots of recursive generics. You can hint to the AOT compiler to allocate more trampolines of type 1 or type 2. Additional AOT compiler command line options can be specified in the "Other Settings" section of the Player Settings. For type 1 trampolines, specify nrgctx-trampolines=ABCD, where ABCD is the number of trampolines required (e.g. 4096). For type 2 trampolines, specify nimt-trampolines=ABCD.
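For example, using the placeholder format from the text above, the AOT compiler options field might contain an entry such as the following (4096 is an arbitrary illustration; check how many trampolines your project actually needs):

```
nimt-trampolines=4096
```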

After upgrading Xcode Unity iOS runtime fails with message "You are using Unity iPhone Basic. You are not allowed to remove the Unity splash screen from your game"

Some recent Xcode releases introduced changes to the PNG compression and optimization tool. These changes can cause false positives in the Unity iOS runtime checks for splash screen modifications. If you encounter such problems, try upgrading Unity to the latest publicly available version. If that does not help, you might consider the following workaround:

  • Replace your Xcode project from scratch when building from Unity (instead of appending it)
  • Delete already installed project from device
  • Clean project in Xcode (Product->Clean)
  • Clear Xcode's Derived Data folders (Xcode->Preferences->Locations)

If this still does not help try disabling PNG re-compression in Xcode:

  • Open your Xcode project
  • Select "Unity-iPhone" project there
  • Select "Build Settings" tab there
  • Look for "Compress PNG files" option and set it to NO

App Store submission fails with "iPhone/iPod Touch: application executable is missing a required architecture. At least one of the following architecture(s) must be present: armv6" message

You might get this message when updating an existing application that was previously submitted with armv6 support. Unity 4.x and Xcode 4.5 no longer support the armv6 platform. To solve the submission problem, set the Target OS Version in the Unity Player Settings to 4.3 or higher.

WWW downloads are working fine in Unity Editor and on Android, but not on iOS

The most common mistake is to assume that WWW downloads always happen on a separate thread. On some platforms this may be true, but you should not take it for granted. The best way to track the status of a WWW download is either to use a yield statement or to check the status in an Update method. You should not use busy while loops for this.
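As a sketch of the yield approach, a coroutine can suspend until the download finishes regardless of how the platform schedules it internally (the URL is a placeholder):

```csharp
using UnityEngine;
using System.Collections;

public class DownloadExample : MonoBehaviour {
    IEnumerator Start () {
        // Placeholder URL - replace with your own.
        WWW www = new WWW("http://example.com/file.bin");
        // Yielding suspends the coroutine until the download completes,
        // whether or not the platform uses a separate thread internally.
        yield return www;
        if (www.error != null)
            Debug.Log("Download failed: " + www.error);
        else
            Debug.Log("Downloaded " + www.bytes.Length + " bytes");
    }
}
```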

"PlayerLoop called recursively!" error occurs when using Cocoa via a native function called from a script

Some operations with the UI will result in iOS redrawing the window immediately (the most common example is adding a UIView with a UIViewController to the main UIWindow). If you call a native function from a script, it will happen inside Unity's PlayerLoop, resulting in PlayerLoop being called recursively. In such cases, you should consider using the performSelectorOnMainThread method with waitUntilDone set to false. It will inform iOS to schedule the operation to run between Unity's PlayerLoop calls.

Profiler or Debugger unable to see game running on iOS device

  • Check that you have built a Development build, and ticked the "Enable Script Debugging" and "Autoconnect profiler" boxes (as appropriate).
  • The application running on the device will make a multicast broadcast to 225.0.0.222 on UDP port 54997. Check that your network settings allow this traffic. Then, the profiler will make a connection to the remote device on a port in the range 55000 - 55511 to fetch profiler data from the device. These ports will need to be open for UDP access.

Missing DLLs

If your application runs correctly in the editor but you get errors in your iOS project, this may be caused by missing DLLs (e.g. I18N.dll, I18N.West.dll). In this case, try copying those DLLs from within the Unity.app to your project's Assets/Plugins folder. The location of the DLLs within the Unity app is:

 Unity.app/Contents/Frameworks/Mono/lib/mono/unity 

You should then also check the stripping level of your project to ensure the classes in the DLLs aren't being removed when the build is optimised. Refer to the iOS Optimisation Page for more information on iOS Stripping Levels.

Xcode Debugger console reports: ExecutionEngineException: Attempting to JIT compile method '(wrapper native-to-managed) Test:TestFunc (int)' while running with --aot-only

Typically, this message is received when a managed function delegate is passed to a native function but the required wrapper code was not generated when building the application. You can help the AOT compiler by hinting which methods will be passed as delegates to native code. This can be done by adding the MonoPInvokeCallbackAttribute custom attribute. Currently, only static methods can be passed as delegates to native code.

Sample code:

using UnityEngine;
using System.Collections;
using System;
using System.Runtime.InteropServices;
using AOT;

public class NewBehaviourScript : MonoBehaviour {

	[DllImport ("__Internal")]
	private static extern void DoSomething (NoParamDelegate del1, StringParamDelegate del2);

	delegate void NoParamDelegate ();
	delegate void StringParamDelegate (string str);

	[MonoPInvokeCallback (typeof (NoParamDelegate))]
	public static void NoParamCallback()
	{
		Debug.Log ("Hello from NoParamCallback");
	}

	[MonoPInvokeCallback (typeof (StringParamDelegate))]
	public static void StringParamCallback(string str)
	{
		Debug.Log (string.Format ("Hello from StringParamCallback {0}", str));
	}

	// Use this for initialization
	void Start () {
		DoSomething(NoParamCallback, StringParamCallback);
	}
}

Android

Troubleshooting Android development

Unity fails to install your application to your device

  1. Verify that your computer can actually see and communicate with the device. See the Publishing Builds page for further details.
  2. Check the error message in the Unity console. This will often help diagnose the problem.

If you get an error saying "Unable to install APK, protocol failure" during a build then this indicates that the device is connected to a low-power USB port (perhaps a port on a keyboard or other peripheral). If this happens, try connecting the device to a USB port on the computer itself.

Your application crashes immediately after launch.

  1. Ensure that you are not trying to use NativeActivity with devices that do not support it.
  2. Try removing any native plugins you have.
  3. Try disabling stripping.
  4. Use adb logcat to get the crash report from your device.
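For step 4, a typical adb session might look like this (this assumes the Android SDK platform tools are on your PATH and the device is connected with USB debugging enabled):

```shell
# Clear the log buffer, reproduce the crash, then dump the log to a file.
adb logcat -c
adb logcat -d > crash.txt
# Alternatively, filter live output to Unity-tagged messages only.
adb logcat -s Unity
```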

Building DEX Failed

This is an error which will produce a message like the following:-

Building DEX Failed!
G:\Unity\JavaPluginSample\Temp/StagingArea> java -Xmx1024M 
-Djava.ext.dirs="G:/AndroidSDK/android-sdk_r09-windows\platform-tools/lib/" 
-jar "G:/AndroidSDK/android-sdk_r09-windows\platform-tools/lib/dx.jar" 
--dex --verbose --output=bin/classes.dex bin/classes.jar plugins
Error occurred during initialization of VM
Could not reserve enough space for object heap
Could not create the Java virtual machine.

This is usually caused by having the wrong version of Java installed on your machine. Updating your Java installation to the latest version will generally solve this issue.

The game crashes after a couple of seconds when playing video

Make sure Settings->Developer Options->Don't keep activities isn't enabled on the phone. The video player is its own activity and therefore the regular game activity will be destroyed if the video player is activated.

My game quits when I press the sleep button

Change the <activity> tag in the AndroidManifest.xml to contain an android:configChanges attribute as described here.

An example activity tag might look something like this:-

<activity android:name=".AdMobTestActivity"
                  android:label="@string/app_name"
                  android:configChanges="fontScale|keyboard|keyboardHidden|locale|mnc|mcc|navigation|orientation|screenLayout|screenSize|smallestScreenSize|uiMode|touchscreen">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>

Page last updated: 2012-11-28



iphone-bugreporting

Before submitting a bug report, please check the iOS Troubleshooting page, where you will find solutions to common crashes and other problems.

If your application crashes in the Xcode debugger then you can add valuable information to your bug report as follows:-

  1. Click Continue (Run->Continue) twice
  2. Open the debugger console (Run->Console) and enter (in the console): thread apply all bt
  3. Copy all console output and send it together with your bugreport.

If your application crashes on the iOS device then you should retrieve the crash report as described here on Apple's website. Please attach the crash report, your built application and console log to your bug report before submitting.

Page last updated: 2011-11-09



android-GettingStarted

Building games for a device running Android OS requires an approach similar to that for iOS development. However, the hardware is not completely standardized across all devices, and this raises issues that don't occur in iOS development. There are some feature differences in the Android version of Unity just as there are with the iOS version.

Setting up your Android Developer environment

You will need to have your Android developer environment set up before you can test your Unity games on the device. This involves downloading and installing the Android SDK with the different Android platforms and adding your physical device to your system (this is done a bit differently depending on whether you are developing on Windows or Mac). This setup process is explained on the Android developer website, and there may be additional information provided by the manufacturer of your device. Since this is a complex process, we've provided a basic outline of the tasks that must be completed before you can run code on your Android device or in the Android emulator. However, the best thing to do is follow the instructions step-by-step from the Android developer portal.

Access Android Functionality

Unity Android provides scripting APIs to access various input data and settings. You can find out more about the available classes on the Android scripting page.
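As a brief sketch, a script can read the accelerometer and touch input through the standard Input class (the class name here is arbitrary):

```csharp
using UnityEngine;

public class InputProbe : MonoBehaviour {
    void Update () {
        // Device tilt as a 3D vector.
        Vector3 tilt = Input.acceleration;
        // Iterate over the currently active touches.
        foreach (Touch touch in Input.touches) {
            if (touch.phase == TouchPhase.Began)
                Debug.Log("Touch began at " + touch.position + ", tilt " + tilt);
        }
    }
}
```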

Exposing Native C, C++ or Java Code to Scripts

Unity Android allows you to call custom functions written in C/C++ directly from C# scripts (Java functions can be called indirectly). To find out how to make functions from native code accessible from Unity, visit the plugins page.

Occlusion Culling

Unity includes support for occlusion culling, which is a particularly valuable optimization on mobile platforms. More information can be found on the occlusion culling page.

Splash Screen Customization

The splash screen displayed while the game launches can be customized - see this page for further details.

Troubleshooting and Bug Reports

There are many reasons why your application may crash or fail to work as you expected. Our Android troubleshooting guide will help you get to the bottom of bugs as quickly as possible. If, after consulting the guide, you suspect the problem is internal to Unity then you should file a bug report - see this page for details on how to do this.

How Unity Android Differs from Desktop Unity

Strongly Typed JavaScript

For performance reasons, dynamic typing in JavaScript is always turned off in Unity Android, as if #pragma strict were applied automatically to all scripts. This is important to know if you start with a project originally developed for the desktop platforms since you may find you get unexpected compile errors when switching to Android; dynamic typing is the first thing to investigate. These errors are usually easy to fix if you make sure all variables are explicitly typed or use type inference on initialization.

ETC as Recommended Texture Compression

Although Unity Android does support DXT/PVRTC/ATC textures, Unity will decompress the textures into RGB(A) format at runtime if those compression methods are not supported by the particular device in use. This could have an impact on the GPU rendering speed and it is recommended to use the ETC format instead. ETC is the de facto standard compression format on Android, and should be supported on all post 2.0 devices. However, ETC does not support an alpha channel and RGBA 16-bit will sometimes be the best trade-off between size, quality and rendering speed where alpha is required.

It is also possible to create separate android distribution archives (.apk) for each of the DXT/PVRTC/ATC formats, and let the Android Market's filtering system select the correct archives for different devices (see Publishing Builds for Android).

Movie Playback

Movie textures are not supported on Android, but full-screen streaming playback is provided via scripting functions. To learn about supported file formats and scripting API, consult the movie page or the Android supported media formats page.

Further Reading

Page last updated: 2011-11-23



android-sdksetup

There are some steps you must follow before you can build and run any code on your Android device. This is true regardless of whether you use Unity or write Android applications from scratch.

1. Download the Android SDK

Go to the Android Developer SDK webpage. Download and unpack the latest Android SDK.

2. Installing the Android SDK

Follow the instructions under Installing the SDK (although you can freely skip the optional parts relating to Eclipse). In step 4 of Installing the SDK be sure to add at least one Android platform with API level equal to or higher than 9 (Platform 2.3 or greater), the Platform Tools, and the USB drivers if you're using Windows.

3. Get the device recognized by your system

This can be tricky, especially under Windows-based systems where drivers tend to be a problem. Also, your device may come with additional information or specific drivers from the manufacturer.

Note: Don't forget to turn on "USB Debugging" on your device. You can do this from the home screen: press MENU, select Applications > Development, then enable USB debugging.

If you are unsure whether your device is properly installed on your system, please read the troubleshooting page for details.

4. Add the Android SDK path to Unity

The first time you build a project for Android (or if Unity later fails to locate the SDK) you will be asked to locate the folder where you installed the Android SDK (you should select the root folder of the SDK installation). The location of the Android SDK can also be changed in the editor by selecting Unity > Preferences from the menu and then clicking on External Tools in the preferences window.

Page last updated: 2012-03-24



android-remote

Android Remote is an Android application that makes your device act as a remote control for the project in Unity. This is useful for rapid development when you don't want to compile and deploy your project to the device for each change.

How to use Android remote

To use Android Remote, you should first make sure that you have the latest Android SDK installed (this is necessary to set up port forwarding on the device). Then, connect the device to your computer with a USB cable and launch the Android Remote app. When you press Play in the Unity editor, the device will act as a remote control and will pass accelerometer and touch input events to the running game.

Page last updated: 2011-11-24



android-troubleshooting

This section addresses common problems that can arise when using Unity. Each platform is dealt with separately below.

Desktop

In MonoDevelop, the Debug button is greyed out!

  • This means that MonoDevelop was unable to find the Unity executable. In the MonoDevelop preferences, go to the Unity/Debugger section and then browse to where your Unity executable is located.

Is there a way to get rid of the welcome page in MonoDevelop?

  • Yes. In the MonoDevelop preferences, go to the Visual Style section, and uncheck "Load welcome page on startup".

Geforce 7300GT on OSX 10.6.4

  • Deferred rendering is disabled because materials are not displayed correctly on the Geforce 7300GT under OSX 10.6.4. This happens because of buggy video drivers.

On Windows x64, Unity crashes when my script throws a NullReferenceException

Graphics

Slow framerate and/or visual artifacts.

  • This may occur if your video card drivers are not up to date. Make sure you have the latest official drivers from your card vendor.

Shadows

I see no shadows at all!

  • Shadows are a Unity Pro only feature, so without Unity Pro you won't get shadows. Simpler shadow methods, like using a Projector, are still possible, of course.
  • Shadows also require certain graphics hardware support. See Shadows page for details.
  • Check that shadows are not completely disabled in the Quality Settings.
  • Shadows are currently not supported for Android and iOS mobile platforms.

Some of my objects do not cast or receive shadows

An object's Renderer must have Receive Shadows enabled for shadows to be rendered onto it. Also, an object must have Cast Shadows enabled in order to cast shadows on other objects (both are on by default).

Only opaque objects cast and receive shadows. This means that objects using the built-in Transparent or Particle shaders will not cast shadows. In most cases it is possible to use Transparent Cutout shaders for objects like fences, vegetation, etc. If you use custom written Shaders, they have to be pixel-lit and use the Geometry render queue. Objects using VertexLit shaders do not receive shadows but are able to cast them.

Only Pixel lights cast shadows. If you want to make sure that a light always casts shadows no matter how many other lights are in the scene, then you can set it to Force Pixel render mode (see the Light reference page).
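The settings above can also be enforced from a script; a minimal sketch for Unity's API of this era (the class name and the Inspector-assigned light field are hypothetical):

```csharp
using UnityEngine;

// Sketch: enforce the shadow settings described above from code.
public class ShadowSetup : MonoBehaviour {
    public Light keyLight; // assign a light in the Inspector (hypothetical field)

    void Start () {
        // Make sure this object's renderer casts and receives shadows.
        renderer.castShadows = true;
        renderer.receiveShadows = true;
        // Force the light to render as a pixel light so it always casts shadows.
        if (keyLight != null)
            keyLight.renderMode = LightRenderMode.ForcePixel;
    }
}
```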

iOS

Troubleshooting on iOS devices

There are some situations with iOS where your game can work perfectly in the Unity editor but then doesn't work or maybe doesn't even start on the actual device. The problems are often related to code or content quality. This section describes the most common scenarios.

The game stops responding after a while. Xcode shows "interrupted" in the status bar.

There are a number of reasons why this may happen. Typical causes include:

  1. Scripting errors such as using uninitialized variables, etc.
  2. Using 3rd party Thumb compiled native libraries. Such libraries trigger a known problem in the iOS SDK linker and might cause random crashes.
  3. Using generic types with value types as parameters (eg, List<int>, List<SomeStruct>, List<SomeEnum>, etc) for serializable script properties.
  4. Using reflection when managed code stripping is enabled.
  5. Errors in the native plugin interface (the managed code method signature does not match the native code function signature).

Information from the Xcode Debugger console can often help detect these problems (Xcode menu: View > Debug Area > Activate Console).

The Xcode console shows a "Program received signal: SIGBUS" or EXC_BAD_ACCESS error.

This message typically appears on iOS devices when your application receives a NullReferenceException. There are two ways to figure out where the fault happened:

Managed stack traces

Since version 3.4, Unity includes software-based handling of the NullReferenceException. The AOT compiler includes quick checks for null references each time a method or variable is accessed on an object. This feature affects script performance, which is why it is enabled only for development builds (for basic license users it is enough to enable the "development build" option in the Build Settings dialog, while iOS Pro license users additionally need to enable the "script debugging" option). If everything was done right and the fault actually occurs in .NET code, then you won't see EXC_BAD_ACCESS anymore. Instead, the .NET exception text will be printed in the Xcode console (or else your code will just handle it in a "catch" statement). Typical output might be:

Unhandled Exception: System.NullReferenceException: A null value was found where an object instance was required.
  at DayController+$handleTimeOfDay$121+$.MoveNext () [0x0035a] in DayController.js:122 

This indicates that the fault happened in the handleTimeOfDay method of the DayController class, which works as a coroutine. Also if it is script code then you will generally be told the exact line number (eg, "DayController.js:122"). The offending line might be something like the following:

 Instantiate(_imgwww.assetBundle.mainAsset);

This might happen if, say, the script accesses an asset bundle without first checking that it was downloaded correctly.
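A defensive version of that call would check the download result first; a sketch, assuming _imgwww is a WWW object whose download has already been yielded on:

```csharp
// Sketch: verify the download succeeded before touching the asset bundle.
if (_imgwww.error == null && _imgwww.assetBundle != null) {
    Instantiate(_imgwww.assetBundle.mainAsset);
} else {
    Debug.Log("Asset bundle unavailable: " + _imgwww.error);
}
```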

Native stack traces

Native stack traces are a much more powerful tool for fault investigation but using them requires some expertise. Also, you generally can't continue after these native (hardware memory access) faults happen. To get a native stack trace, type bt all into the Xcode Debugger Console. Carefully inspect the printed stack traces - they may contain hints about where the error occurred. You might see something like:

...
Thread 1 (thread 11523): 
#0 0x006267d0 in m_OptionsMenu_Start () 
#1 0x002e4160 in wrapper_runtime_invoke_object_runtime_invoke_void__this___object_intptr_intptr_intptr () 
#2 0x00a1dd64 in mono_jit_runtime_invoke (method=0x18b63bc, obj=0x5d10cb0, params=0x0, exc=0x2fffdd34) at /Users/mantasp/work/unity/unity-mono/External/Mono/mono/mono/mini/mini.c:4487
#3 0x0088481c in MonoBehaviour::InvokeMethodOrCoroutineChecked ()
...

First of all you should find the stack trace for "Thread 1", which is the main thread. The very first lines of the stack trace will point to the place where the error occurred. In this example, the trace indicates that the NullReferenceException happened inside the "OptionsMenu" script's "Start" method. Looking carefully at this method implementation would reveal the cause of the problem. Typically, NullReferenceExceptions happen inside the Start method when incorrect assumptions are made about initialization order. In some cases only a partial stack trace is seen on the Debugger Console:

Thread 1 (thread 11523): 
#0 0x0062564c in start ()

This indicates that native symbols were stripped during the Release build of the application. The full stack trace can be obtained with the following procedure:

  • Remove application from device.
  • Clean all targets.
  • Build and run.
  • Get stack traces again as described above.

EXC_BAD_ACCESS starts occurring when an external library is linked to the Unity iOS application.

This usually happens when an external library is compiled with the ARM Thumb instruction set. Currently such libraries are not compatible with Unity. The problem can be solved easily by recompiling the library without Thumb instructions. You can do this for the library's Xcode project with the following steps:

  • in Xcode, select "View" > "Navigators" > "Show Project Navigator" from the menu
  • select the "Unity-iPhone" project, activate "Build Settings" tab
  • in the search field enter : "Other C Flags"
  • add -mno-thumb flag there and rebuild the library.

If the library source is not available you should ask the supplier for a non-thumb version of the library.

The Xcode console shows "WARNING -> applicationDidReceiveMemoryWarning()" and the application crashes immediately afterwards

(Sometimes you might see a message like Program received signal: 0.) This warning message is often not fatal and merely indicates that iOS is low on memory and is asking applications to free up some memory. Typically, background processes like Mail will free some memory and your application can continue to run. However, if your application continues to use memory or ask for more, the OS will eventually start killing applications, and yours could be one of them. Apple does not document what memory usage is safe, but empirical observations show that applications using less than 50% of the device's total RAM (i.e. around 200-256 MB on a 2nd generation iPad) do not have major memory usage problems. The main metric you should rely on is how much RAM your application uses. Your application's memory usage consists of the following major components:

  • application code (the OS needs to load and keep your application code in RAM, but some of it might be discarded if really needed)
  • native heap (used by the engine to store its state, your assets, etc. in RAM)
  • managed heap (used by your Mono runtime to keep C# or JavaScript objects)
  • GLES driver memory pools: textures, framebuffers, compiled shaders, etc.

Your application's memory usage can be tracked with the Xcode Instruments tools: Activity Monitor, Object Allocations and VM Tracker. You can start them from the Xcode Run menu: Product > Profile, and then select the specific tool. The Activity Monitor tool shows all process statistics, including Real memory, which can be regarded as the total amount of RAM used by your application. Note: the combination of OS and device hardware version might noticeably affect memory usage numbers, so you should be careful when comparing figures obtained on different devices.

Note: The internal profiler shows only the heap allocated by .NET scripts. Total memory usage can be determined via Xcode Instruments as shown above. This figure includes parts of the application binary, some standard framework buffers, Unity engine internal state buffers, the .NET runtime heap (number printed by internal profiler), GLES driver heap and some other miscellaneous stuff.

The Object Allocations tool displays all allocations made by your application and includes both native heap and managed heap statistics (don't forget to check the Created and still living box to get the current state of the application). The important statistic is the Net bytes value.

To keep memory usage low:

  • Reduce the application binary size by using the strongest iOS stripping options (Advanced license feature), and avoid unnecessary dependencies on different .NET libraries. See the player settings and player size optimization manual pages for further details.
  • Reduce the size of your content. Use PVRTC compression for textures and use low poly models. See the manual page about reducing file size for more information.
  • Don't allocate more memory than necessary in your scripts. Track the mono heap size and usage with the internal profiler.
  • Note: with Unity 3.0, the scene loading implementation has changed significantly and now all scene assets are preloaded. This results in fewer hiccups when instantiating game objects. If you need more fine-grained control of asset loading and unloading during gameplay, you should use Resources.Load and Object.Destroy.
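A minimal sketch of such fine-grained loading and unloading (the resource path and class name are hypothetical; the prefab is assumed to live under a Resources folder):

```csharp
using UnityEngine;

public class AssetLifecycle : MonoBehaviour {
    GameObject instance;

    void Start () {
        // Load a prefab on demand from a Resources folder (path is hypothetical).
        Object prefab = Resources.Load("Enemies/Goblin");
        instance = (GameObject)Instantiate(prefab);
    }

    void OnDestroy () {
        // Release the instance when it is no longer needed.
        if (instance != null)
            Destroy(instance);
    }
}
```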

Querying the OS about the amount of free memory may seem like a good idea to evaluate how well your application is performing. However, the free memory statistic is likely to be unreliable since the OS uses a lot of dynamic buffers and caches. The only reliable approach is to keep track of memory consumption for your application and use that as the main metric. Pay attention to how the graphs from the tools described above change over time, especially after loading new levels.

The game runs correctly when launched from Xcode but crashes while loading the first level when launched manually on the device.

There could be several reasons for this. You need to inspect the device logs to get more details. Connect the device to your Mac, launch Xcode and select Window > Organizer from the menu. Select your device in the Organizer's left toolbar, then click on the "Console" tab and review the latest messages carefully. Additionally, you may need to investigate crash reports. You can find out how to obtain crash reports here: http://developer.apple.com/iphone/library/technotes/tn2008/tn2151.html.

The Xcode Organizer console contains the message "killed by SpringBoard".

There is a poorly-documented time limit for an iOS application to render its first frames and process input. If your application exceeds this limit, it will be killed by SpringBoard. This may happen in an application with a first scene which is too large, for example. To avoid this problem, it is advisable to create a small initial scene which just displays a splash screen, waits a frame or two with yield and then starts loading the real scene. This can be done with code as simple as the following:

 
function Start () {
    yield;
    Application.LoadLevel("Test");
}

Type.GetProperty() / Type.GetValue() cause crashes on the device

Currently Type.GetProperty() and Type.GetValue() are supported only for the .NET 2.0 Subset profile. You can select the .NET API compatibility level in the Player Settings.

Note: Type.GetProperty() and Type.GetValue() might be incompatible with managed code stripping and might need to be excluded (you can supply a custom non-strippable type list during the stripping process to accomplish this). For further details, see the iOS player size optimization guide.
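For reference, a small sketch of the kind of reflection call in question (the type and property names are arbitrary); if stripping is enabled, the reflected type may need to be preserved via link.xml:

```csharp
using UnityEngine;
using System.Reflection;

public class ReflectionExample : MonoBehaviour {
    public int Health { get; set; }

    void Start () {
        Health = 42;
        // Look up the property by name at runtime; stripping can remove
        // such members unless they are preserved explicitly.
        PropertyInfo prop = typeof(ReflectionExample).GetProperty("Health");
        Debug.Log("Health via reflection: " + prop.GetValue(this, null));
    }
}
```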

The game crashes with the error message "ExecutionEngineException: Attempting to JIT compile method 'SomeType`1<SomeValueType>:.ctor ()' while running with --aot-only."

The Mono .NET implementation for iOS is based on AOT (ahead of time compilation to native code) technology, which has its limitations. It compiles only those generic type methods (where a value type is used as a generic parameter) which are explicitly used by other code. When such methods are used only via reflection or from native code (ie, the serialization system) then they get skipped during AOT compilation. The AOT compiler can be hinted to include code by adding a dummy method somewhere in the script code. This can refer to the missing methods and so get them compiled ahead of time.

void _unusedMethod()
{
    var tmp = new SomeType<SomeValueType>();
}

Note: value types are basic types, enums and structs.

Various crashes occur on the device when a combination of System.Security.Cryptography and managed code stripping is used

.NET Cryptography services rely heavily on reflection and so are not compatible with managed code stripping, which involves static code analysis. Sometimes the easiest solution to these crashes is to exclude the whole System.Security.Cryptography namespace from the stripping process.

The stripping process can be customized by adding a custom link.xml file to the Assets folder of your Unity project. This specifies which types and namespaces should be excluded from stripping. Further details can be found in the iOS player size optimization guide.

link.xml

<linker>
       <assembly fullname="mscorlib">
               <namespace fullname="System.Security.Cryptography" preserve="all"/>
       </assembly>
</linker>

Application crashes when using System.Security.Cryptography.MD5 with managed code stripping

You can follow the advice listed above, or work around this problem by adding an explicit reference to the specific class in your script code:

object obj = new MD5CryptoServiceProvider();

"Ran out of trampolines of type 1/2" runtime error

This error usually happens if you use lots of recursive generics. You can hint to the AOT compiler to allocate more trampolines of type 1 or type 2. Additional AOT compiler command line options can be specified in the "Other Settings" section of the Player Settings. For type 1 trampolines, specify nrgctx-trampolines=ABCD, where ABCD is the number of new trampolines required (e.g. 4096). For type 2 trampolines, specify nimt-trampolines=ABCD.

After upgrading Xcode Unity iOS runtime fails with message "You are using Unity iPhone Basic. You are not allowed to remove the Unity splash screen from your game"

Some recent Xcode releases introduced changes to the PNG compression and optimization tool. These changes can cause false positives in the Unity iOS runtime checks for splash screen modifications. If you encounter such problems, try upgrading Unity to the latest publicly available version. If that does not help, you might consider the following workaround:

  • Replace your Xcode project from scratch when building from Unity (instead of appending it)
  • Delete already installed project from device
  • Clean project in Xcode (Product->Clean)
  • Clear Xcode's Derived Data folders (Xcode->Preferences->Locations)

If this still does not help try disabling PNG re-compression in Xcode:

  • Open your Xcode project
  • Select "Unity-iPhone" project there
  • Select "Build Settings" tab there
  • Look for "Compress PNG files" option and set it to NO

App Store submission fails with "iPhone/iPod Touch: application executable is missing a required architecture. At least one of the following architecture(s) must be present: armv6" message

You might get this message when updating an existing application that was previously submitted with armv6 support. Unity 4.x and Xcode 4.5 no longer support the armv6 platform. To solve the submission problem, set the Target OS Version in the Unity Player Settings to 4.3 or higher.

WWW downloads are working fine in Unity Editor and on Android, but not on iOS

The most common mistake is to assume that WWW downloads always happen on a separate thread. On some platforms this may be true, but you should not take it for granted. The best way to track the status of a WWW download is either to yield on the WWW object or to check its status in an Update method. Do not use busy while loops for this.
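A minimal coroutine sketch of the yield approach (the URL is a placeholder; substitute your own resource):

```csharp
using UnityEngine;
using System.Collections;

public class DownloadExample : MonoBehaviour {
	IEnumerator Start () {
		// Placeholder URL -- replace with your own resource.
		WWW www = new WWW ("http://example.com/file.bin");

		// Yielding suspends this coroutine until the download
		// finishes, without blocking the player loop.
		yield return www;

		if (www.error != null)
			Debug.Log ("Download failed: " + www.error);
		else
			Debug.Log ("Downloaded " + www.bytes.Length + " bytes");
	}
}
```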

"PlayerLoop called recursively!" error occurs when using Cocoa via a native function called from a script

Some operations with the UI will result in iOS redrawing the window immediately (the most common example is adding a UIView with a UIViewController to the main UIWindow). If you call a native function from a script, it will happen inside Unity's PlayerLoop, resulting in PlayerLoop being called recursively. In such cases, consider using the performSelectorOnMainThread method with waitUntilDone set to false. This tells iOS to schedule the operation to run between Unity's PlayerLoop calls.

Profiler or Debugger unable to see game running on iOS device

  • Check that you have built a Development build, and ticked the "Enable Script Debugging" and "Autoconnect profiler" boxes (as appropriate).
  • The application running on the device will make a multicast broadcast to 225.0.0.222 on UDP port 54997. Check that your network settings allow this traffic. Then, the profiler will make a connection to the remote device on a port in the range 55000 - 55511 to fetch profiler data from the device. These ports will need to be open for UDP access.

Missing DLLs

If your application runs fine in the editor but you get errors in your iOS project, this may be caused by missing DLLs (e.g. I18N.dll, I18N.West.dll). In this case, try copying those DLLs from within the Unity.app to your project's Assets/Plugins folder. The location of the DLLs within the Unity application is:

 Unity.app/Contents/Frameworks/Mono/lib/mono/unity 

You should then also check the stripping level of your project to ensure the classes in the DLLs aren't being removed when the build is optimised. Refer to the iOS Optimisation Page for more information on iOS Stripping Levels.

Xcode Debugger console reports: ExecutionEngineException: Attempting to JIT compile method '(wrapper native-to-managed) Test:TestFunc (int)' while running with --aot-only

This message is typically received when a managed method is passed as a delegate to a native function, but the required wrapper code was not generated when the application was built. You can help the AOT compiler by hinting which methods will be passed as delegates to native code. This is done by adding the MonoPInvokeCallbackAttribute custom attribute. Currently, only static methods can be passed as delegates to native code.

Sample code:

using UnityEngine;
using System.Collections;
using System;
using System.Runtime.InteropServices;
using AOT;

public class NewBehaviourScript : MonoBehaviour {

	[DllImport ("__Internal")]
	private static extern void DoSomething (NoParamDelegate del1, StringParamDelegate del2);

	delegate void NoParamDelegate ();
	delegate void StringParamDelegate (string str);

	[MonoPInvokeCallback (typeof (NoParamDelegate))]
	public static void NoParamCallback()
	{
		Debug.Log ("Hello from NoParamCallback");
	}

	[MonoPInvokeCallback (typeof (StringParamDelegate))]
	public static void StringParamCallback(string str)
	{
		Debug.Log (string.Format ("Hello from StringParamCallback {0}", str));
	}

	// Use this for initialization
	void Start () {
		DoSomething(NoParamCallback, StringParamCallback);
	}
}

Android

Troubleshooting Android development

Unity fails to install your application to your device

  1. Verify that your computer can actually see and communicate with the device. See the Publishing Builds page for further details.
  2. Check the error message in the Unity console. This will often help diagnose the problem.

If you get an error saying "Unable to install APK, protocol failure" during a build then this indicates that the device is connected to a low-power USB port (perhaps a port on a keyboard or other peripheral). If this happens, try connecting the device to a USB port on the computer itself.

Your application crashes immediately after launch.

  1. Ensure that you are not trying to use NativeActivity with devices that do not support it.
  2. Try removing any native plugins you have.
  3. Try disabling stripping.
  4. Use adb logcat to get the crash report from your device.

Building DEX Failed

This error produces a message like the following:

Building DEX Failed!
G:\Unity\JavaPluginSample\Temp/StagingArea> java -Xmx1024M 
-Djava.ext.dirs="G:/AndroidSDK/android-sdk_r09-windows\platform-tools/lib/" 
-jar "G:/AndroidSDK/android-sdk_r09-windows\platform-tools/lib/dx.jar" 
--dex --verbose --output=bin/classes.dex bin/classes.jar plugins
Error occurred during initialization of VM
Could not reserve enough space for object heap
Could not create the Java virtual machine.

This is usually caused by having the wrong version of Java installed on your machine. Updating your Java installation to the latest version will generally solve this issue.

The game crashes after a couple of seconds when playing video

Make sure Settings->Developer Options->Don't keep activities isn't enabled on the phone. The video player is its own activity and therefore the regular game activity will be destroyed if the video player is activated.

My game quits when I press the sleep button

Change the <activity> tag in the AndroidManifest.xml to include the android:configChanges attribute as described here.

An example activity tag might look something like this:

<activity android:name=".AdMobTestActivity"
                  android:label="@string/app_name"
                  android:configChanges="fontScale|keyboard|keyboardHidden|locale|mnc|mcc|navigation|orientation|screenLayout|screenSize|smallestScreenSize|uiMode|touchscreen">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>

Page last updated: 2012-11-28



android-bugreporting

Before submitting a bug report that says nothing more than "it crashes", please look through the Troubleshooting Android development page first.

At this point there are no advanced debugging tools for investigating on-device app crashes. However, you can use the adb tool (found under Android-SDK/platform-tools) with the logcat parameter, which prints status reports from your device. These reports may include information related to the crash.

If you are sure that the crash you're experiencing is due to a bug in Unity, please save the adb logcat output, prepare a reproduction project, and use the Bug Reporter (Help/Report a bug) to inform us about it. We will get back to you as soon as we can.

Page last updated: 2011-02-25



android-unsupported

Graphics

Scripting

Page last updated: 2012-10-09



android-OBBsupport

This page describes how .apk and .obb files are handled, in Unity 4.0 and in general.

    Everything else (all additional scenes, resources, streaming assets etc) are placed in the .obb.

This is how things stand in Unity 4.0. The Split Application Binary option is not the only way to split an .apk into an .apk/.obb pair (you can use third-party plugins, asset bundles, and so on), but it is the only automatic splitting that is officially supported.

Now to the downloading of the .obb.

Finally:

Page last updated: 2012-11-14



Android Player Settings

The Player Settings is where you define various (platform-specific) parameters for the final game that you build with Unity. It is important to fill these in correctly: some of the values are used by the Resolution Dialog that launches when you open a standalone game, for example, while others are used by Xcode when building your game for iOS devices.

To see the Player Settings, choose Edit->Project Settings->Player from the menu.


Global settings that apply to every project you create
Cross-Platform Properties
Company Name: The name of your company. This is used to locate the preferences file.
Product Name: The name that will appear on the menu bar when your game is running. It is also used to locate the preferences file.
Default Icon: The default icon the application will have on every platform (you can override this later for platform-specific needs).
Default Cursor: The default cursor the application will have on every supported platform.
Cursor Hotspot: The cursor hotspot in pixels from the top left of the default cursor.

Per-Platform Settings

Desktop

Web Player

Resolution And Presentation

Resolution
Default Screen Width: Screen width the web player will be generated with.
Default Screen Height: Screen height the web player will be generated with.
Run in background: Check this if you don't want the game to stop executing when the web player loses focus.
WebPlayer Template: For more information you should check the Using WebPlayer Templates page; each of the built-in and custom templates is shown here with an icon.

Icon

Icons have no effect on web player builds (you can set icons in the Native Client build section of the Player Settings).

Other Settings

Rendering
Rendering Path: This property is shared between standalone and web player content.
Vertex Lit: Lowest lighting fidelity, no shadows. Best used on old machines or limited mobile platforms.
Forward with Shaders: Good support for lighting features; limited support for shadows.
Deferred Lighting: Best support for lighting and shadowing features, but requires a certain level of hardware support. Best used if you have many realtime lights. Unity Pro only.
Color Space: The color space to be used for rendering.
GammaSpace Rendering: Rendering is gamma-corrected.
Linear Rendering: Rendering is done in linear space (hardware sampling).
Use Direct3D 11: Use Direct3D 11 for rendering.
Static Batching: Set this to use static batching on your build (disabled by default for web players). Unity Pro only.
Dynamic Batching: Set this to use dynamic batching on your build (enabled by default).
Streaming
First Streamed Level: If you are publishing a Streamed Web Player, this is the index of the first level that will have access to all Resources.Load assets.
Configuration
Scripting Define Symbols: Custom compilation flags (see the platform dependent compilation page for details).
Optimization
Optimize Mesh Data: Remove any data from meshes that is not required by the material applied to them (tangents, normals, colors, UV).

Standalone

Resolution And Presentation

Resolution
Default Screen Width: Screen width the standalone game will use by default.
Default Screen Height: Screen height the player will use by default.
Run in background: Check this if you don't want the game to stop executing when it loses focus.
Standalone Player Options
Default is Full Screen: Check this if you want to start your game by default in full screen mode.
Capture Single Screen: If enabled, a standalone game running in fullscreen mode will not darken the secondary monitor in multi-monitor setups.
Display Resolution Dialog
Disabled: No resolution dialog will appear when starting the game.
Enabled: The resolution dialog will always appear when the game is launched.
Hidden by default: The resolution dialog will only appear if the "Alt" key is held down when starting the game.
Use Player Log: Write a log file with debugging information. If you plan to submit your application to the Mac App Store, leave this option unchecked. Checked is the default.
Resizable Window: Allow the user to resize the standalone player window.
Mac App Store Validation: Enable receipt validation for the Mac App Store.
Mac Fullscreen Mode: Options for fullscreen mode on Mac builds.
Capture Display: Unity takes over the whole display (i.e. GUI from other apps will not appear and the user cannot switch apps until fullscreen mode is exited).
Fullscreen Window: Unity runs in a window that covers the whole screen at desktop resolution. GUI from other apps displays correctly, and it is possible to switch apps with Cmd + Tab or trackpad gestures on OS X 10.7 and above.
Fullscreen Window with Menu Bar and Dock: As Fullscreen Window mode, but the standard menu bar and dock are also displayed.
Supported Aspect Ratios: The aspect ratios selectable in the Resolution Dialog will be those enabled in this list, provided they are supported by the monitor.

Icon

Override for Standalone: Check this if you want to assign a custom icon for your standalone game. Different-sized icons should be placed in the squares below.

Splash Image

Config Dialog Banner: Add your custom splash image to be displayed when the game is starting.

Other Settings

Rendering
Rendering Path: This property is shared between standalone and web player content.
Vertex Lit: Lowest lighting fidelity, no shadows. Best used on old machines or limited mobile platforms.
Forward with Shaders: Good support for lighting features; limited support for shadows.
Deferred Lighting: Best support for lighting and shadowing features, but requires a certain level of hardware support. Best used if you have many realtime lights. Unity Pro only.
Color Space: The color space to be used for rendering.
GammaSpace Rendering: Rendering is gamma-corrected.
Linear Rendering: Rendering is done in linear space (hardware sampling).
Static Batching: Set this to use static batching on your build (disabled by default for web players). Unity Pro only.
Dynamic Batching: Set this to use dynamic batching on your build (enabled by default).
Configuration
Scripting Define Symbols: Custom compilation flags (see the platform dependent compilation page for details).
Optimization
API Compatibility Level
.Net 2.0: .Net 2.0 libraries. Maximum .NET compatibility, biggest file sizes.
.Net 2.0 Subset: Subset of full .NET compatibility, smaller file sizes.
Optimize Mesh Data: Remove any data from meshes that is not required by the material applied to them (tangents, normals, colors, UV).

iOS

Resolution And Presentation

Resolution
Default Orientation: (This property is shared between iOS and Android.)
Portrait: The device is in portrait mode, with the device held upright and the home button at the bottom.
Portrait Upside Down: The device is in portrait mode but upside down, with the device held upright and the home button at the top.
Landscape Right: The device is in landscape mode, with the device held upright and the home button on the left side.
Landscape Left: The device is in landscape mode, with the device held upright and the home button on the right side.
Auto Rotation: The screen orientation is set automatically based on the physical device orientation.
Auto Rotation settings
Use Animated Autorotation: When checked, orientation changes are animated. Only applies when Default Orientation is set to Auto Rotation.
Allowed orientations for Auto Rotation
Portrait: When checked, portrait orientation is allowed. Only applies when Default Orientation is set to Auto Rotation.
Portrait Upside Down: When checked, upside-down portrait orientation is allowed. Only applies when Default Orientation is set to Auto Rotation.
Landscape Right: When checked, landscape orientation with the home button on the left side is allowed. Only applies when Default Orientation is set to Auto Rotation.
Landscape Left: When checked, landscape orientation with the home button on the right side is allowed. Only applies when Default Orientation is set to Auto Rotation.
Status Bar
Status Bar Hidden: Specifies whether the status bar is initially hidden when the application launches.
Status Bar Style: Specifies the style of the status bar as the application launches.
Default
Black Translucent
Black Opaque
Use 32-bit Display Buffer: Specifies that the display buffer should be created to hold 32-bit color values (16-bit by default). Use it if you see banding or need alpha in your ImageEffects, since they create render textures in the same format as the display buffer.
Show Loading Indicator: Options for the loading indicator.
Don't Show: No indicator.
White Large: Indicator shown large and in white.
White: Indicator shown at normal size in white.
Gray: Indicator shown at normal size in gray.

Icon

Override for iOS: Check this if you want to assign a custom icon for your iPhone/iPad game. Different-sized icons should be placed in the squares below.
Prerendered icon: If unchecked, iOS applies sheen and bevel effects to the application icon.

Splash Image

Mobile Splash Screen (Unity Pro only): Specifies the texture to be used for the iOS splash screen. The standard splash screen size is 320x480. (This property is shared between iOS and Android.)
High Res. iPhone (Unity Pro only): Specifies the texture to be used for the splash screen on fourth-generation iOS devices. The splash screen size is 640x960.
iPad Portrait (Unity Pro only): Specifies the texture to be used as the iPad portrait splash screen. The standard splash screen size is 768x1024.
High Res. iPad Portrait: Specifies the texture to be used as the high-resolution iPad portrait splash screen. The standard splash screen size is 1536x2048.
iPad Landscape (Unity Pro only): Specifies the texture to be used as the iPad landscape splash screen. The standard splash screen size is 1024x768.
High Res. iPad Landscape (Unity Pro only): Specifies the texture to be used as the high-resolution iPad landscape splash screen. The standard splash screen size is 2048x1536.

Other Settings

Rendering

Static Batching: Set this to use static batching on your build (enabled by default). Unity Pro only.
Dynamic Batching: Set this to use dynamic batching on your build (enabled by default).
Identification
Bundle Identifier: The string used in your provisioning certificate from your Apple Developer Network account. (This property is shared between iOS and Android.)
Bundle Version: Specifies the build version number of the bundle, which identifies an iteration (released or unreleased) of the bundle. It consists of one or more period-separated numbers that increase monotonically.
Configuration
Target Device: Specifies the type of device the application targets.
iPhone Only: The application targets iPhone devices only.
iPad Only: The application targets iPad devices only.
iPhone + iPad: The application targets both iPhone and iPad devices.
Target Resolution: The resolution you want to use on the deployed device. (This setting has no effect on devices with a maximum resolution of 480x320.)
Native (Default Device Resolution): Uses the device's native resolution.
Auto (Best Performance): Chooses the resolution automatically, favoring performance over graphics quality.
Auto (Best Quality): Chooses the resolution automatically, favoring graphics quality over performance.
320p (iPhone): Pre-Retina iPhone display.
640p (iPhone Retina Display): iPhone Retina display.
768p (iPad): iPad display.
Graphics Level: OpenGL version.
OpenGL ES 1.x: OpenGL ES 1.x versions.
OpenGL ES 2.0: OpenGL ES 2.0.
Accelerometer Frequency: How often the accelerometer is sampled.
Disabled: The accelerometer is not sampled.
15Hz: 15 samples per second.
30Hz: 30 samples per second.
60Hz: 60 samples per second.
100Hz: 100 samples per second.
Override iPod Music: If checked, the application silences the user's iPod music. Otherwise the user's iPod music continues playing in the background.
UI Requires Persistent WiFi: Specifies whether the application requires a Wi-Fi connection. iOS maintains the active Wi-Fi connection while the application is running.
Exit on Suspend: Specifies whether the application should quit when suspended to the background, on iOS versions that support multitasking.
Scripting Define Symbols: Custom compilation flags (see the platform dependent compilation page for details).
Optimization
Api Compatibility Level: Specifies the active .NET API profile.
.Net 2.0: .Net 2.0 libraries. Maximum .NET compatibility, biggest file sizes.
.Net 2.0 Subset: Subset of full .NET compatibility, smaller file sizes.
AOT compilation options: Additional options for the AOT compiler.
SDK Version: Specifies the iPhone OS SDK version to be used by Xcode for the build.
Device SDK: SDK to run on actual hardware.
Simulator SDK: SDK to run only on the simulator.
Target iOS Version: Specifies the oldest iOS version the final application can run on; ranges from iOS 4.0 to 6.0.
Stripping Level (Unity Pro only): Options to strip out scripting features to reduce the built player's size. (This setting is shared between the iOS and Android platforms.)
Disabled: No stripping is done.
Strip Assemblies: Level 1 size reduction.
Strip ByteCode: Level 2 size reduction (includes level 1).
Use micro mscorlib: Level 3 size reduction (includes levels 1 and 2).
Script Call Optimization: Optionally disable exception handling for a speed boost at runtime.
Slow and Safe: Full exception handling occurs on the device, with some performance impact.
Fast but no Exceptions: No data is provided for exceptions on the device, but the game runs faster.
Optimize Mesh Data: Remove any data from meshes that is not required by the material applied to them (tangents, normals, colors, UV).

Note: If you build for, say, iPhone OS 3.2 and then select Simulator 3.2 in Xcode, you will get a ton of errors. Be sure to select the correct Target SDK in the Unity editor.

Android

Resolution And Presentation


Resolution and presentation settings for Android project builds
Resolution
Default Orientation: (This property is shared between iOS and Android.)
Portrait: The device is in portrait mode, with the device held upright and the home button at the bottom.
Portrait Upside Down: The device is in portrait mode but upside down, with the device held upright and the home button at the top.
Landscape Right: The device is in landscape mode, with the device held upright and the home button on the left side.
Landscape Left: The device is in landscape mode, with the device held upright and the home button on the right side.
Use 32-bit Display Buffer: Specifies whether the display buffer should be created to hold 32-bit color values (16-bit by default). Use it if you see banding or need alpha in your ImageEffects, since they create render textures in the same format as the display buffer. Not supported on devices running pre-Gingerbread OS versions (these are forced to 16-bit).
Use 24-bit Depth Buffer: Specifies that the depth buffer should be created to hold (at least) 24-bit values. Use it only if you see 'z-fighting' or other artifacts, as it may affect performance.

Icon

The various icons your project will use when built.
Override for Android: Check this if you want to assign a custom icon for your Android game. Different-sized icons should be placed in the squares below.

Splash Image

The splash image displayed when your project launches.
Mobile Splash Screen (Unity Pro only): Specifies the texture to be used for the splash screen. The standard splash screen size is 320x480. (This property is shared between Android and iOS.)
Splash Scaling: Specifies how the splash image will be scaled on the device.

Other Settings
Rendering
Static Batching: Set this to use static batching on your build (disabled by default for web players). Unity Pro only.
Dynamic Batching: Set this to use dynamic batching on your build (enabled by default).
Identification
Bundle Identifier: The string used in your provisioning certificate from your Apple Developer Network account. (This property is shared between iOS and Android.)
Bundle Version: Specifies the build version number of the bundle, which identifies an iteration (released or unreleased) of the bundle. It consists of one or more period-separated numbers that increase monotonically. (This property is shared between iOS and Android.)
Bundle Version Code: An internal version number. This number is used only to determine whether one version is more recent than another, with higher numbers indicating more recent versions. This is not the version number shown to users; that number is set by the versionName attribute. The value must be set as an integer, such as "100". You can define it however you like, as long as each successive version has a higher number. For example, it could be a build number. Alternatively, you can translate a version number in "x.y" format into an integer by encoding "x" and "y" separately in the upper and lower 16 bits. Or you can simply increase the number by one each time a new version is released.
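As a sketch of the "x.y" encoding mentioned above (the variable names are illustrative, not part of the Android API), the packing is plain bit arithmetic:

```csharp
// Hypothetical example: encode version 2.3 into a single
// versionCode integer, with the major part in the upper 16 bits.
int major = 2;
int minor = 3;
int versionCode = (major << 16) | minor;   // 131075

// Decoding reverses the shift and mask:
int decodedMajor = versionCode >> 16;      // 2
int decodedMinor = versionCode & 0xFFFF;   // 3
```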
Minimum API Level: The minimum API version required to support the build.
Configuration
Graphics Level: Select either ES 1.1 ('fixed function') or ES 2.0 ('shader based') OpenGL level. When using the AVD (emulator), only ES 1.x is supported.
Install Location: Specifies the application install location on the device (for details see http://developer.android.com/guide/appendix/install-location.html).
Automatic: Let the OS decide. The user will be able to move the app back and forth.
Prefer External: Install the app to external storage (SD card) if possible. The OS does not guarantee this; if it is not possible, the app will be installed to internal memory.
Force Internal: Force the app to be installed to internal memory. The user will be unable to move the app to external storage.
Internet Access: When set to Require, this enables the network permission even if your scripts are not using it. Automatically enabled for development builds.
Write Access: When set to External (SDCard), this enables write access to external storage such as the SD card. Automatically enabled for development builds.
Scripting Define Symbols: Custom compilation flags (see the platform dependent compilation page for details).
Optimization
Api Compatibility Level: Specifies the active .NET API profile.
.Net 2.0: .Net 2.0 libraries. Maximum .NET compatibility, biggest file sizes.
.Net 2.0 Subset: Subset of full .NET compatibility, smaller file sizes.
Stripping Level (Unity Pro only): Options to strip out scripting features to reduce the built player's size. (This setting is shared between the iOS and Android platforms.)
Disabled: No stripping is done.
Strip Assemblies: Level 1 size reduction.
Strip ByteCode: Level 2 size reduction (includes level 1).
Use micro mscorlib: Level 3 size reduction (includes levels 1 and 2).
Enable "logcat" profiler: Enable this if you want to get feedback from your device while testing your projects. This makes adb logcat print log output from the device to the console (only available in development builds).
Optimize Mesh Data: Remove any data from meshes that is not required by the material applied to them (tangents, normals, colors, UV).
Publishing Settings

Publishing settings for the Android Market
Keystore
Use Existing Keystore / Create New Keystore: Use this to choose whether to create a new keystore or use an existing one.
Browse Keystore: Lets you select an existing keystore.
Keystore password: Password for the keystore.
Confirm password: Password confirmation; only enabled if the Create New Keystore option is chosen.
Key
Alias: Key alias.
Password: Password for the key alias.
Split Application Binary: Flag to split the application into an expansion file pair. Only useful when publishing to the Google Play Store and the final build exceeds 50 MB.

Note that for security reasons, Unity saves neither the keystore password nor the key password. Also note that signing must be done from Unity's Player Settings; using jarsigner will not work.

Flash

Resolution And Presentation

Resolution
Default Screen Width: Screen width the player will be generated with.
Default Screen Height: Screen height the player will be generated with.

Other Settings

Optimization
Stripping: Bytecode can optionally be stripped during the build.
Strip Physics Code: Remove physics engine code from the build when not required.
Optimize Mesh Data: Remove any data from meshes that is not required by the material applied to them (tangents, normals, colors, UV).

Google Native Client

Resolution and Presentation

Resolution
Default Screen Width: Screen width the player will be generated with.
Default Screen Height: Screen height the player will be generated with.

Icon


The various icons your project will use when built.
Override for Web: Check this if you want to assign a custom icon for your Native Client game. Different-sized icons should be placed in the squares below.

Other Settings

Rendering
Static Batching: Set this to use static batching on your build (disabled by default for web players). Unity Pro only.
Dynamic Batching: Set this to use dynamic batching on your build (enabled by default).
Configuration
Scripting Define Symbols: Custom compilation flags (see the platform dependent compilation page for details).
Optimization
API Compatibility Level
.Net 2.0: .Net 2.0 libraries. Maximum .NET compatibility, biggest file sizes.
.Net 2.0 Subset: Subset of full .NET compatibility, smaller file sizes.
Strip Physics Code: Remove physics engine code from the build when not required.
Optimize Mesh Data: Remove any data from meshes that is not required by the material applied to them (tangents, normals, colors, UV).

Details

Desktop

The Player Settings window is where many technical preference defaults are set. See also Quality Settings, where the different graphics quality levels can be set up.

Publishing a web player

Default Web Screen Width and Default Web Screen Height determine the size used in the generated html file. You can change the size in the html file later.

Default Screen Width and Default Screen Height are used by the Web Player when entering fullscreen mode through the context menu while the Web Player is running.

Customizing your Resolution Dialog


The Resolution Dialog, presented to end users

You have the option of adding a custom banner image to the Screen Resolution Dialog in the standalone player. The maximum image size is 432 x 163 pixels. The image will not be scaled up to fit the screen selector; instead it will be centered and cropped.

Publishing to the Mac App Store

Use Player Log enables writing a log file with debugging information. This is useful for finding out what happened if there are problems with your game. When publishing games for Apple's Mac App Store, it is recommended to turn this off, because Apple may reject your submission otherwise. See this manual page for further information about log files.

Use Mac App Store Validation enables receipt validation for the Mac App Store. If this is enabled, your game will only run when it contains a valid receipt from the Mac App Store. Use this when submitting games to Apple for publishing on the App Store. This prevents people from running the game on any computer other than the one it was purchased on. Note that this feature does not implement any strong copy protection. In particular, any potential crack against one Unity game would work against any other Unity content. For this reason, it is recommended that you implement your own receipt validation code on top of this, using Unity's plugin feature. However, since Apple requires plugin validation to happen before the screen setup dialog is shown, you should still enable this check, or Apple may reject your submission.

iOS

Bundle Identifier

The Bundle Identifier string must match the provisioning profile of the game you are building. The basic structure of the identifier is com.CompanyName.GameName. This structure may vary internationally based on where you live, so always default to the string provided to you by Apple for your developer account. Your GameName is set up in your provisioning certificates, which are manageable from the Apple iPhone Developer Center website. Please refer to the Apple iPhone Developer Center website for more information on how this is done.

Stripping Level (Unity Pro only)

Most games don't use all of the DLLs that are included. With this option, you can strip out unused parts to reduce the size of the built player on iOS devices. If your game is using classes that would normally be stripped out by the currently selected option, you'll be presented with a debug message when you build your game.

Script Call Optimization

A good development practice on iOS is to never rely on exception handling (either internally or through the use of try/catch blocks). When using the default Slow and Safe option, any exceptions that occur on the device will be caught and a stack trace will be provided. When using the Fast but no Exceptions option, any exceptions that occur will crash the game and no stack trace will be provided. However, the game will run faster, since the processor does not have to handle exceptions. When releasing your game to the world, it's best to publish with the Fast but no Exceptions option.

Android

Bundle Identifier

The Bundle Identifier string is the unique name of your application when published to the Android Market and installed on the device. The basic structure of the identifier is com.CompanyName.GameName, and it can be chosen arbitrarily. In Unity this field is shared with the iOS Player Settings for convenience.

Stripping Level (Unity Pro only)

Most games don't use all of the DLLs that are included. With this option, you can strip out unused parts to reduce the size of the built player on Android devices.

Page last updated: 2012-11-24



android-API

Unity Android provides a number of scripting APIs unified with iOS APIs to access handheld device functionality. For cross-platform projects, UNITY_ANDROID is defined for conditionally compiling Android-specific C# code. The following scripting classes contain Android-related changes (some of the API is shared between Android and iOS):

Input: Access to multi-touch screen, accelerometer and device orientation.
iPhoneSettings: Some of the Android settings, such as screen orientation, dimming and information about device hardware.
iPhoneKeyboard: Support for the native on-screen keyboard.
iPhoneUtils: Useful functions for movie playback, anti-piracy protection and vibration.

Further Reading

Page last updated: 2010-09-09



Android-Input

Desktop

Note: Keyboard, joystick and gamepad input work on the desktop versions of Unity (including webplayer and Flash) but not on mobiles.

Unity supports keyboard, joystick and gamepad input.

Virtual axes and buttons can be created in the Input Manager, and end users can configure Keyboard input in a nice screen configuration dialog.

You can set up joysticks, gamepads, keyboard, and mouse, then access them all through one simple scripting interface.

From scripts, all virtual axes are accessed by their name.

Every project has the following default input axes when it's created:

  • Horizontal and Vertical are mapped to w, a, s, d and the arrow keys.
  • Fire1, Fire2, Fire3 are mapped to Control, Option (Alt), and Command, respectively.
  • Mouse X and Mouse Y are mapped to the delta of mouse movement.
  • Window Shake X and Window Shake Y are mapped to the movement of the window.

Adding new Input Axes

If you want to add new virtual axes go to the Edit->Project Settings->Input menu. Here you can also change the settings of each axis.

You map each axis to two buttons on a joystick, mouse, or keyboard keys.

Name: The name used to check this axis from a script.
Descriptive Name: Positive value name displayed in the Input tab of the Configuration dialog for standalone builds.
Descriptive Negative Name: Negative value name displayed in the Input tab of the Configuration dialog for standalone builds.
Negative Button: The button used to push the axis in the negative direction.
Positive Button: The button used to push the axis in the positive direction.
Alt Negative Button: Alternative button used to push the axis in the negative direction.
Alt Positive Button: Alternative button used to push the axis in the positive direction.
Gravity: Speed in units per second that the axis falls toward neutral when no buttons are pressed.
Dead: Size of the analog dead zone. All analog device values within this range map to neutral.
Sensitivity: Speed in units per second that the axis will move toward the target value. This is for digital devices only.
Snap: If enabled, the axis value will reset to zero when pressing a button of the opposite direction.
Invert: If enabled, the Negative Buttons provide a positive value, and vice versa.
Type: The type of inputs that will control this axis.
Axis: The axis of a connected device that will control this axis.
Joy Num: The connected joystick that will control this axis.

Use these settings to fine tune the look and feel of input. They are all documented with tooltips in the Editor as well.

Using Input Axes from Scripts

You can query the current state from a script like this:

value = Input.GetAxis ("Horizontal");

An axis has a value between -1 and 1. The neutral position is 0. This is the case for joystick input and keyboard input.

However, Mouse Delta and Window Shake Delta are how much the mouse or window moved during the last frame. This means it can be larger than 1 or smaller than -1 when the user moves the mouse quickly.

It is possible to create multiple axes with the same name. When getting the input axis, the axis with the largest absolute value will be returned. This makes it possible to assign more than one input device to one axis name. For example, create one axis for keyboard input and one axis for joystick input with the same name. If the user is using the joystick, input will come from the joystick, otherwise input will come from the keyboard. This way you don't have to consider where the input comes from when writing scripts.

Button Names

To map a key to an axis, you have to enter the key's name in the Positive Button or Negative Button property in the Inspector.

The names of keys follow this convention:

  • Normal keys: "a", "b", "c" ...
  • Number keys: "1", "2", "3", ...
  • Arrow keys: "up", "down", "left", "right"
  • Keypad keys: "[1]", "[2]", "[3]", "[+]", "[equals]"
  • Modifier keys: "right shift", "left shift", "right ctrl", "left ctrl", "right alt", "left alt", "right cmd", "left cmd"
  • Mouse Buttons: "mouse 0", "mouse 1", "mouse 2", ...
  • Joystick Buttons (from any joystick): "joystick button 0", "joystick button 1", "joystick button 2", ...
  • Joystick Buttons (from a specific joystick): "joystick 1 button 0", "joystick 1 button 1", "joystick 2 button 0", ...
  • Special keys: "backspace", "tab", "return", "escape", "space", "delete", "enter", "insert", "home", "end", "page up", "page down"
  • Function keys: "f1", "f2", "f3", ...

The names used to identify the keys are the same in the scripting interface and the Inspector.

value = Input.GetKey ("a");

Mobile Input

On iOS and Android, the Input class offers access to touchscreen, accelerometer and geographical/location input.

Access to the keyboard on mobile devices is provided via the native on-screen keyboard.

Multi-Touch Screen

The iPhone and iPod Touch devices are capable of tracking up to five fingers touching the screen simultaneously. You can retrieve the status of each finger touching the screen during the last frame by accessing the Input.touches property array.

Android devices don't have a unified limit on how many fingers they track. Instead, it varies from device to device and can be anything from two-touch on older devices to five fingers on some newer devices.

Each finger touch is represented by an Input.Touch data structure:

fingerId: The unique index for a touch.
position: The screen position of the touch.
deltaPosition: The screen position change since the last frame.
deltaTime: Amount of time that has passed since the last state change.
tapCount: The iPhone/iPad screen is able to distinguish quick finger taps by the user. This counter will let you know how many times the user has tapped the screen without moving a finger to the sides. Android devices do not count taps; this field is always 1.
phase: Describes the state (or "phase") of the touch. It can help you determine whether the touch just began, whether the user moved their finger, or whether they just lifted it.

Phase can be one of the following:

Began: A finger just touched the screen.
Moved: A finger moved on the screen.
Stationary: A finger is touching the screen but hasn't moved since the last frame.
Ended: A finger was lifted from the screen. This is the final phase of a touch.
Canceled: The system cancelled tracking for the touch, for example when the user puts the device to their face or when more than five touches happen simultaneously. This is the final phase of a touch.

Following is an example script which will shoot a ray whenever the user taps on the screen:

var particle : GameObject;
function Update () {
	for (var touch : Touch in Input.touches) {
		if (touch.phase == TouchPhase.Began) {
			// Construct a ray from the current touch coordinates
			var ray = Camera.main.ScreenPointToRay (touch.position);
			if (Physics.Raycast (ray)) {
				// Create a particle if hit
				Instantiate (particle, transform.position, transform.rotation);
			}
		}
	}
}

Mouse Simulation

On top of native touch support Unity iOS/Android provides a mouse simulation. You can use mouse functionality from the standard Input class.

Device Orientation

Unity iOS/Android allows you to get discrete description of the device physical orientation in three-dimensional space. Detecting a change in orientation can be useful if you want to create game behaviors depending on how the user is holding the device.

You can retrieve device orientation by accessing the Input.deviceOrientation property. Orientation can be one of the following:

Unknown: The orientation of the device cannot be determined (for example, when the device is rotated diagonally).
Portrait: The device is in portrait mode, held upright with the home button at the bottom.
PortraitUpsideDown: The device is in portrait mode but upside down, held upright with the home button at the top.
LandscapeLeft: The device is in landscape mode, held upright with the home button on the right side.
LandscapeRight: The device is in landscape mode, held upright with the home button on the left side.
FaceUp: The device is held parallel to the ground with the screen facing upwards.
FaceDown: The device is held parallel to the ground with the screen facing downwards.
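A script can react to orientation changes by polling the property each frame. The following is an illustrative sketch (not from the manual):

```
private var lastOrientation : DeviceOrientation;

function Update () {
	// Poll for changes in the physical orientation of the device
	if (Input.deviceOrientation != lastOrientation) {
		lastOrientation = Input.deviceOrientation;
		Debug.Log ("Orientation changed to: " + lastOrientation);
	}
}
```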

Accelerometer

As the mobile device moves, a built-in accelerometer reports linear acceleration changes along the three primary axes in three-dimensional space. Acceleration along each axis is reported directly by the hardware as G-force values. A value of 1.0 represents a load of about +1g along a given axis while a value of -1.0 represents -1g. If you hold the device upright (with the home button at the bottom) in front of you, the X axis is positive along the right, the Y axis is positive directly up, and the Z axis is positive pointing toward you.

You can retrieve the accelerometer value by accessing the Input.acceleration property.

The following is an example script which will move an object using the accelerometer:

var speed = 10.0;
function Update () {
	var dir : Vector3 = Vector3.zero;

	// we assume that the device is held parallel to the ground
	// and the Home button is in the right hand

	// remap the device acceleration axis to game coordinates:
	//  1) XY plane of the device is mapped onto XZ plane
	//  2) rotated 90 degrees around Y axis
	dir.x = -Input.acceleration.y;
	dir.z = Input.acceleration.x;

	// clamp acceleration vector to the unit sphere
	if (dir.sqrMagnitude > 1)
		dir.Normalize();

	// Make it move 10 meters per second instead of 10 meters per frame...
	dir *= Time.deltaTime;

	// Move object
	transform.Translate (dir * speed);
}

Low-Pass Filter

Accelerometer readings can be jerky and noisy. Applying low-pass filtering on the signal allows you to smooth it and get rid of high frequency noise.

The following script shows you how to apply low-pass filtering to accelerometer readings:

var AccelerometerUpdateInterval : float = 1.0 / 60.0;
var LowPassKernelWidthInSeconds : float = 1.0;

private var LowPassFilterFactor : float = AccelerometerUpdateInterval / LowPassKernelWidthInSeconds; // tweakable
private var lowPassValue : Vector3 = Vector3.zero;
function Start () {
	lowPassValue = Input.acceleration;
}

function LowPassFilterAccelerometer() : Vector3 {
	lowPassValue = Mathf.Lerp(lowPassValue, Input.acceleration, LowPassFilterFactor);
	return lowPassValue;
}

The greater the value of LowPassKernelWidthInSeconds, the slower the filtered value will converge towards the current input sample (and vice versa). You can call LowPassFilterAccelerometer() wherever you would otherwise read Input.acceleration directly.
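The behaviour of the filter factor can be checked outside Unity. The following standalone sketch (plain JavaScript, with Lerp written out by hand; the variable names mirror the script above but the scenario is illustrative) feeds one second's worth of samples from a sudden 1g step input through the filter:

```javascript
// Exponential smoothing: the same recurrence used by LowPassFilterAccelerometer().
function lerp(a, b, t) { return a + (b - a) * t; }

var accelerometerUpdateInterval = 1.0 / 60.0;   // one sample per frame at 60 Hz
var lowPassKernelWidthInSeconds = 1.0;          // larger value => smoother but laggier
var lowPassFilterFactor = accelerometerUpdateInterval / lowPassKernelWidthInSeconds;

var lowPassValue = 0.0;   // filter state, e.g. the device at rest
var sample = 1.0;         // a sudden, sustained 1g reading

// Feed one second (60 frames) of identical samples through the filter.
for (var i = 0; i < 60; i++) {
    lowPassValue = lerp(lowPassValue, sample, lowPassFilterFactor);
}

console.log(lowPassValue.toFixed(2));  // ≈ 0.64
```

After one kernel width the filtered value has covered roughly 63% of the step, which is the standard time-constant behaviour of an exponential low-pass filter.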

I'd like as much precision as possible when reading the accelerometer. What should I do?

Reading the Input.acceleration variable is not the same as sampling the hardware. Put simply, Unity samples the hardware at a frequency of 60Hz and stores the result in the variable. In reality, things are a little more complicated -- accelerometer sampling doesn't occur at consistent time intervals when the CPU is under significant load. As a result, the system might report two samples during one frame, then one sample during the next frame.

You can access all the measurements made by the accelerometer during the frame. The following code illustrates a simple average of all the accelerometer events that were collected within the last frame:

function GetAccelerometerValue () : Vector3 {
	var acc : Vector3 = Vector3.zero;
	var period : float = 0.0;
	for (var evnt : iPhoneAccelerationEvent in iPhoneInput.accelerationEvents) {
		acc += evnt.acceleration * evnt.deltaTime;
		period += evnt.deltaTime;
	}
	if (period > 0)
		acc *= 1.0 / period;
	return acc;
}

Further Reading

The Unity mobile input API was originally based on Apple's API, so it may help to learn more about the native API to gain a better understanding of Unity's Input API. The Apple input API documentation can be found in your locally installed iPhone SDK Reference Documentation.

Note: The Apple documents contain native Objective-C code. It is not necessary to understand them to use Unity on mobile devices, but they may be helpful to some!

iOS

Device geographical location

Device geographical location can be obtained via the iPhoneInput.lastLocation property. Before reading this property you should start location service updates using iPhoneSettings.StartLocationServiceUpdates() and check the service status via iPhoneSettings.locationServiceStatus. See the scripting reference for details.

Page last updated: 2012-06-28



Android-Keyboard

In most cases, Unity will handle keyboard input automatically for GUI elements but it is also easy to show the keyboard on demand from a script.

iOS

Using the Keyboard

GUI Elements

The keyboard will appear automatically when a user taps on editable GUI elements. Currently, GUI.TextField, GUI.TextArea and GUI.PasswordField will display the keyboard; see the GUI class documentation for further details.

Manual Keyboard Handling

Use the iPhoneKeyboard.Open function to open the keyboard. Please see the iPhoneKeyboard scripting reference for the parameters that this function takes.

Keyboard Type Summary

The Keyboard supports the following types:

iPhoneKeyboardType.Default: Letters. Can be switched to a keyboard with numbers and punctuation.
iPhoneKeyboardType.ASCIICapable: Letters. Can be switched to a keyboard with numbers and punctuation.
iPhoneKeyboardType.NumbersAndPunctuation: Numbers and punctuation. Can be switched to a keyboard with letters.
iPhoneKeyboardType.URL: Letters with slash and .com buttons. Can be switched to a keyboard with numbers and punctuation.
iPhoneKeyboardType.NumberPad: Only the numbers 0 to 9.
iPhoneKeyboardType.PhonePad: Keyboard used to enter phone numbers.
iPhoneKeyboardType.NamePhonePad: Letters. Can be switched to the phone keyboard.
iPhoneKeyboardType.EmailAddress: Letters with an @ sign. Can be switched to a keyboard with numbers and punctuation.

Text Preview

By default, an edit box will be created and placed on top of the keyboard after it appears. This acts as a preview of the text the user is typing, so that the text is always visible to the user. However, you can disable the text preview by setting iPhoneKeyboard.hideInput to true. Note that this works only for certain keyboard types and input modes; for example, it will not work for phone keypads or multi-line text input. In such cases the edit box will always appear. iPhoneKeyboard.hideInput is a global variable and will affect all keyboards.

Keyboard Orientation

By default, the keyboard automatically follows the device orientation. To disable or enable rotation to a certain orientation, use the following properties available in iPhoneKeyboard:

autorotateToPortrait: Enable or disable autorotation to portrait orientation (home button at the bottom).
autorotateToPortraitUpsideDown: Enable or disable autorotation to upside-down portrait orientation (home button at the top).
autorotateToLandscapeLeft: Enable or disable autorotation to landscape left orientation (home button on the right).
autorotateToLandscapeRight: Enable or disable autorotation to landscape right orientation (home button on the left).

Visibility and Keyboard Size

There are three keyboard properties in iPhoneKeyboard that determine keyboard visibility status and size on the screen.

visible: Returns true if the keyboard is fully visible on the screen and can be used to enter characters.
area: Returns the position and dimensions of the keyboard.
active: Returns true if the keyboard is activated. This is not a static property; you must have a keyboard instance to use it.

Note that iPhoneKeyboard.area will return a rect with position and size set to 0 until the keyboard is fully visible on the screen. You should not query this value immediately after iPhoneKeyboard.Open. The sequence of keyboard events is as follows:

  • iPhoneKeyboard.Open is called. iPhoneKeyboard.active returns true, iPhoneKeyboard.visible returns false and iPhoneKeyboard.area returns (0, 0, 0, 0).
  • The keyboard slides into view. All properties remain the same.
  • The keyboard stops sliding. iPhoneKeyboard.active returns true, iPhoneKeyboard.visible returns true and iPhoneKeyboard.area returns the real position and size of the keyboard.
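The sequence above can be expressed as a simple coroutine. This is an illustrative sketch (not from the manual) that opens the keyboard and waits until it has finished sliding in before reading its area; it assumes, as the list above implies, that active is an instance property while visible and area are static:

```
private var keyboard : iPhoneKeyboard;

function OpenAndMeasure () {
	keyboard = iPhoneKeyboard.Open ("");
	// area stays (0, 0, 0, 0) until the keyboard is fully visible
	while (keyboard.active && !iPhoneKeyboard.visible)
		yield;
	if (iPhoneKeyboard.visible)
		Debug.Log ("Keyboard area: " + iPhoneKeyboard.area);
}
```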

Secure Text Input

It is possible to configure the keyboard to hide symbols while typing. This is useful when users are required to enter sensitive information (such as passwords). To manually open the keyboard with secure text input enabled, use the following code:

iPhoneKeyboard.Open("", iPhoneKeyboardType.Default, false, false, true);

Hiding text while typing

Alert keyboard

To display the keyboard with a black semi-transparent background instead of the classic opaque one, call iPhoneKeyboard.Open as follows:

iPhoneKeyboard.Open("", iPhoneKeyboardType.Default, false, false, true, true);

Classic keyboard

Alert keyboard

Android

Unity Android reuses the iOS API to display the system keyboard. Although Unity Android supports most of the functionality of its iPhone counterpart, there are two aspects that are not supported:

  • iPhoneKeyboard.hideInput
  • iPhoneKeyboard.area

Please also note that the layout of an iPhoneKeyboardType can differ somewhat between devices.

Page last updated: 2011-11-03



Android-Advanced

iOS

Advanced iOS scripting

Determining Device Generation

Different device generations support different functionality and have widely varying performance. You should query the device's generation and decide which functionality should be disabled to compensate for slower devices.

You can find the device generation from the iPhone.generation property. The reported generation can be one of the following:

  • iPhone
  • iPhone3G
  • iPhone3GS
  • iPhone4
  • iPodTouch1Gen
  • iPodTouch2Gen
  • iPodTouch3Gen
  • iPodTouch4Gen
  • iPad1Gen

You can find more information about different device generations, performance and supported functionality in our iPhone Hardware Guide.
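As an illustrative fragment (the choice of what to disable is hypothetical), a script might turn off expensive effects on the slowest hardware:

```
// Disable heavy effects on first-generation hardware
if (iPhone.generation == iPhoneGeneration.iPhone ||
	iPhone.generation == iPhoneGeneration.iPodTouch1Gen) {
	// ...turn off expensive shaders, particle effects, etc. here
}
```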

Device Properties

There are a number of device-specific properties that you can access:-

SystemInfo.deviceUniqueIdentifier: Unique device identifier.
SystemInfo.deviceName: User-specified name for the device.
SystemInfo.deviceModel: Is it an iPhone or an iPod Touch?
SystemInfo.operatingSystem: Operating system name and version.

Anti-Piracy Check

Pirates will often hack an application from the AppStore (by removing Apple DRM protection) and then redistribute it for free. Unity iOS comes with an anti-piracy check which allows you to determine if your application was altered after it was submitted to the AppStore.

You can check whether your application is genuine (not hacked) with the Application.genuine property. If this property returns false then you might notify the user that the application has been hacked, or perhaps disable access to some of its functions.

Note: accessing the Application.genuine property is a fairly expensive operation and so you shouldn't do it during frame updates or other time-critical code.

Vibration Support

You can trigger a vibration by calling Handheld.Vibrate. Note that iPod Touch devices lack vibration hardware and will just ignore this call.

Android

Advanced Android scripting

Determining Device Generation

Different Android devices support different functionality and have widely varying performance. You should target specific devices or device families and decide which functionality should be disabled to compensate for slower devices. There are a number of device-specific properties that you can access to determine which device is being used.

Note: The Android Marketplace does some additional compatibility filtering, so you should not be concerned that an ARMv7-only app optimised for OpenGL ES 2.0 might be offered to old, slow devices.

Device Properties

SystemInfo.deviceUniqueIdentifier: Unique device identifier.
SystemInfo.deviceName: User-specified name for the device.
SystemInfo.deviceModel: The model of the device.
SystemInfo.operatingSystem: Operating system name and version.

Anti-Piracy Check

Pirates will often hack an application (by removing its copy protection) and then redistribute it for free. Unity Android comes with an anti-piracy check which allows you to determine if your application was altered after it was submitted to the store.

You can check whether your application is genuine (not hacked) with the Application.genuine property. If this property returns false then you might notify the user that the application has been hacked, or perhaps disable access to some of its functions.

Note: Application.genuineCheckAvailable should be used along with Application.genuine to verify that application integrity can actually be confirmed. Accessing the Application.genuine property is a fairly expensive operation and so you shouldn't do it during frame updates or other time-critical code.
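The two checks above can be combined into a one-time startup test. This is an illustrative sketch (the variable name is hypothetical) that caches the result so the expensive property is never read in per-frame code:

```
private var isGenuine : boolean = true;

function Start () {
	// Run the expensive integrity check once, outside any per-frame code
	if (Application.genuineCheckAvailable)
		isGenuine = Application.genuine;
}
```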

Vibration Support

You can trigger a vibration by calling Handheld.Vibrate. However, devices lacking vibration hardware will just ignore this call.

Page last updated: 2012-07-12



Android-DotNet

iOS

Unity iOS supports two .NET API compatibility levels: .NET 2.0 and a subset of .NET 2.0. You can select the appropriate level in the Player Settings.

.NET API 2.0

Unity supports the .NET 2.0 API profile. This is close to the full .NET 2.0 API and offers the best compatibility with pre-existing .NET code. However, the application's build size and startup time will be relatively poor.

Note: Unity iOS does not support namespaces in scripts. If you have a third party library supplied as source code then the best approach is to compile it to a DLL outside Unity and then drop the DLL file into your project's Assets folder.

.NET 2.0 Subset

Unity also supports the .NET 2.0 Subset API profile. This is close to the Mono "monotouch" profile, so many limitations of the "monotouch" profile also apply to Unity's .NET 2.0 Subset profile. More information on the limitations of the "monotouch" profile can be found here. The advantage of using this profile is reduced build size (and startup time) but this comes at the expense of compatibility with existing .NET code.

Android

Unity Android supports two .NET API compatibility levels: .NET 2.0 and a subset of .NET 2.0. You can select the appropriate level in the Player Settings.

.NET API 2.0

Unity supports the .NET 2.0 API profile. It is close to the full .NET 2.0 API and offers the best compatibility with pre-existing .NET code. However, the application's build size and startup time will be relatively poor.

Note: Unity Android does not support namespaces in scripts. If you have a third party library supplied as source code then the best approach is to compile it to a DLL outside Unity and then drop the DLL file into your project's Assets folder.

.NET 2.0 Subset

Unity also supports the .NET 2.0 Subset API profile. This is close to the Mono "monotouch" profile, so many limitations of the "monotouch" profile also apply to Unity's .NET 2.0 Subset profile. More information on the limitations of the "monotouch" profile can be found here. The advantage of using this profile is reduced build size (and startup time) but this comes at the expense of compatibility with existing .NET code.

Page last updated: 2012-07-12



Android-Plugins

This page describes Native Code Plugins for Android.

Building a Plugin for Android

To build a plugin for Android, you should first obtain the Android NDK and familiarize yourself with the steps involved in building a shared library.

If you are using C++ (.cpp) to implement the plugin you must ensure the functions are declared with C linkage to avoid name mangling issues.

extern "C" {
  float FooPluginFunction ();
} 

Using Your Plugin from C#

Once built, the shared library should be copied to the Assets->Plugins->Android folder. Unity will then find it by name when you define a function like the following in the C# script:-

[DllImport ("PluginName")]
private static extern float FooPluginFunction (); 

Please note that PluginName should not include the prefix ('lib') nor the extension ('.so') of the filename. It is advisable to wrap all native code methods with an additional C# code layer. This code should check Application.platform and call native methods only when the app is running on the actual device; dummy values can be returned from the C# code when running in the Editor. You can also use platform defines to control platform dependent code compilation.
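Such a wrapper might look like the following sketch (the class name and the dummy value are illustrative; only FooPluginFunction, the plugin name and the Application.platform check come from the text above):

```csharp
using UnityEngine;
using System.Runtime.InteropServices;

public class FooPlugin
{
    [DllImport ("PluginName")]
    private static extern float FooPluginFunction ();

    // Call the native method only on the device; return a dummy
    // value when running in the Editor.
    public static float Foo ()
    {
        if (Application.platform == RuntimePlatform.Android)
            return FooPluginFunction ();

        return 0.0f;  // dummy value for the Editor
    }
}
```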

Deployment

For cross platform deployment, your project should include plugins for each supported platform (ie, libPlugin.so for Android, Plugin.bundle for Mac and Plugin.dll for Windows). Unity automatically picks the right plugin for the target platform and includes it with the player.

Using Java Plugins

The Android plugin mechanism also allows Java to be used to enable interaction with the Android OS.

Building a Java Plugin for Android

There are several ways to create a Java plugin but the result in each case is that you end up with a .jar file containing the .class files for your plugin. One approach is to download the JDK, then compile your .java files from the command line with javac. This will create .class files which you can then package into a .jar with the jar command line tool. Another option is to use the Eclipse IDE together with the ADT.

Using Your Java Plugin from Native Code

Once you have built your Java plugin (.jar) you should copy it to the Assets->Plugins->Android folder in the Unity project. Unity will package your .class files together with the rest of the Java code and then access the code using the Java Native Interface (JNI). JNI is used both when calling native code from Java and when interacting with Java (or the JavaVM) from native code.

To find your Java code from the native side you need access to the Java VM. Fortunately, that access can be obtained easily by adding a function like this to your C/C++ code:

jint JNI_OnLoad(JavaVM* vm, void* reserved) {
  JNIEnv* jni_env = 0;
  vm->AttachCurrentThread(&jni_env, 0);
  return JNI_VERSION_1_6;
}

This is all that is needed to start using Java from C/C++. It is beyond the scope of this document to explain JNI completely. However, using it usually involves finding the class definition, resolving the constructor (<init>) method and creating a new object instance, as shown in this example:-

jobject createJavaObject(JNIEnv* jni_env) {
  jclass cls_JavaClass = jni_env->FindClass("com/your/java/Class");			// find class definition
  jmethodID mid_JavaClass = jni_env->GetMethodID (cls_JavaClass, "<init>",  "()V");		// find constructor method
  jobject obj_JavaClass = jni_env->NewObject(cls_JavaClass, mid_JavaClass);		// create object instance
  return jni_env->NewGlobalRef(obj_JavaClass);						// return object with a global reference
} 

Using Your Java Plugin with helper classes

AndroidJNIHelper and AndroidJNI can be used to ease some of the pain with raw JNI.

AndroidJavaObject and AndroidJavaClass automate a lot of tasks and also use caching to make calls to Java faster. The combination of AndroidJavaObject and AndroidJavaClass builds on top of AndroidJNI and AndroidJNIHelper, but also has a lot of logic of its own (to handle the automation). These classes also come in a 'static' version to access static members of Java classes.

You can choose whichever approach you prefer, be it raw JNI through AndroidJNI class methods, or AndroidJNIHelper together with AndroidJNI and eventually AndroidJavaObject/AndroidJavaClass for maximum automation and convenience.

UnityEngine.AndroidJNI is a wrapper for the JNI calls available in C (as described above). All methods in this class are static and have a 1:1 mapping to the Java Native Interface. UnityEngine.AndroidJNIHelper provides helper functionality used by the next level, but it is exposed as public methods because they may be useful in some special cases.

Instances of UnityEngine.AndroidJavaObject and UnityEngine.AndroidJavaClass have a 1:1 mapping to an instance of java.lang.Object and java.lang.Class (or subclasses thereof) on the Java side, respectively. They essentially provide 3 types of interaction with the Java side:

  • Call a method
  • Get the value of a field
  • Set the value of a field

The Call is separated into two categories: Call to a 'void' method, and Call to a method with non-void return type. A generic type is used to represent the return type of those methods which return a non-void type. The Get and Set always take a generic type representing the field type.

Example 1

//The comments describe what you would need to do if you were using raw JNI
 AndroidJavaObject jo = new AndroidJavaObject("java.lang.String", "some_string"); 
 // jni.FindClass("java.lang.String"); 
 // jni.GetMethodID(classID, "<init>", "(Ljava/lang/String;)V"); 
 // jni.NewStringUTF("some_string"); 
 // jni.NewObject(classID, methodID, javaString); 
 int hash = jo.Call<int>("hashCode"); 
 // jni.GetMethodID(classID, "hashCode", "()I"); 
 // jni.CallIntMethod(objectID, methodID);

Here, we're creating an instance of java.lang.String, initialized with a string of our choice and retrieving the hash value for that string.

The AndroidJavaObject constructor takes at least one parameter: the name of the class for which we want to construct an instance. Any parameters after the class name are for the constructor call on the object, in this case the string "some_string". The subsequent Call to hashCode() returns an 'int', which is why we use that as the generic type parameter to the Call method.

Note: You cannot instantiate a nested Java class using dotted notation. Inner classes must use the $ separator, and it should work in both dotted and slashed format. So android.view.ViewGroup$LayoutParams or android/view/ViewGroup$LayoutParams can be used, where a LayoutParams class is nested in a ViewGroup class.

Example 2

One of the plugin samples above shows how to get the cache directory for the current application. This is how you would do the same thing from C# without any plugins:-

 AndroidJavaClass jc = new AndroidJavaClass("com.unity3d.player.UnityPlayer"); 
 // jni.FindClass("com.unity3d.player.UnityPlayer"); 
 AndroidJavaObject jo = jc.GetStatic<AndroidJavaObject>("currentActivity"); 
 // jni.GetStaticFieldID(classID, "Ljava/lang/Object;"); 
 // jni.GetStaticObjectField(classID, fieldID); 
 // jni.FindClass("java.lang.Object"); 

 Debug.Log(jo.Call<AndroidJavaObject>("getCacheDir").Call<string>("getCanonicalPath")); 
 // jni.GetMethodID(classID, "getCacheDir", "()Ljava/io/File;"); // or any baseclass thereof! 
 // jni.CallObjectMethod(objectID, methodID); 
 // jni.FindClass("java.io.File"); 
 // jni.GetMethodID(classID, "getCanonicalPath", "()Ljava/lang/String;"); 
 // jni.CallObjectMethod(objectID, methodID); 
 // jni.GetStringUTFChars(javaString);

In this case, we start with AndroidJavaClass instead of AndroidJavaObject because we want to access a static member of com.unity3d.player.UnityPlayer rather than create a new object (an instance is created automatically by the Android UnityPlayer). Then we access the static field "currentActivity" but this time we use AndroidJavaObject as the generic parameter. This is because the actual field type (android.app.Activity) is a subclass of java.lang.Object, and any non-primitive type must be accessed as AndroidJavaObject. The exceptions to this rule are strings, which can be accessed directly even though they don't represent a primitive type in Java.

After that it is just a matter of traversing the Activity through getCacheDir() to get the File object representing the cache directory, and then calling getCanonicalPath() to get a string representation.

Of course, nowadays you don't need to do that to get the cache directory since Unity provides access to the application's cache and file directory with Application.temporaryCachePath and Application.persistentDataPath.

Example 3

Finally, here is a trick for passing data from Java to script code using UnitySendMessage.

using UnityEngine; 
public class NewBehaviourScript : MonoBehaviour { 

	void Start () { 
		AndroidJNIHelper.debug = true; 
		using (AndroidJavaClass jc = new AndroidJavaClass("com.unity3d.player.UnityPlayer")) { 
			jc.CallStatic("UnitySendMessage", "Main Camera", "JavaMessage", "whoowhoo"); 
		} 
	} 

	void JavaMessage(string message) { 
		Debug.Log("message from java: " + message); 
	}
} 

The Java class com.unity3d.player.UnityPlayer now has a static method UnitySendMessage, equivalent to the iOS UnitySendMessage on the native side. It can be used in Java to pass data to script code.

Here though, we call it directly from script code, which essentially relays the message on the Java side. This then calls back to the native/Unity code to deliver the message to the object named "Main Camera". This object has a script attached which contains a method called "JavaMessage".

Best practice when using Java plugins with Unity

As this section is mainly aimed at people who don't have comprehensive JNI, Java and Android experience, we assume that the AndroidJavaObject/AndroidJavaClass approach has been used for interacting with Java code from Unity.

The first thing to note is that any operation you perform on an AndroidJavaObject or AndroidJavaClass is computationally expensive (as is the raw JNI approach). It is highly advisable to keep the number of transitions between managed and native/Java code to a minimum, for the sake of performance and also code clarity.

You could have a Java method to do all the actual work and then use AndroidJavaObject / AndroidJavaClass to communicate with that method and get the result. However, it is worth bearing in mind that the JNI helper classes try to cache as much data as possible to improve performance.

// The first time you call into Java, the class and method have to be resolved
AndroidJavaObject jo = new AndroidJavaObject("java.lang.String", "some_string");  // somewhat expensive
int hash = jo.Call<int>("hashCode");  // first time - expensive
hash = jo.Call<int>("hashCode");  // second time - not as expensive, as we already know the java method and can call it directly

The Mono garbage collector should release all created instances of AndroidJavaObject and AndroidJavaClass after use, but it is advisable to keep them in a using(){} statement to ensure they are deleted as soon as possible. Without this, you cannot be sure when they will be destroyed. If you set AndroidJNIHelper.debug to true, you will see a record of the garbage collector's activity in the debug output.

//Getting the system language with the safe approach
void Start () { 
	using (AndroidJavaClass cls = new AndroidJavaClass("java.util.Locale")) { 
		using(AndroidJavaObject locale = cls.CallStatic<AndroidJavaObject>("getDefault")) { 
			Debug.Log("current lang = " + locale.Call<string>("getDisplayLanguage")); 

		} 
	} 
}

You can also call the .Dispose() method directly to ensure there are no Java objects lingering. The actual C# object might live a bit longer, but will eventually be garbage collected by Mono.

Extending the UnityPlayerActivity Java Code

With Unity Android it is possible to extend the standard UnityPlayerActivity class (the primary Java class for the Unity Player on Android, similar to AppController.mm on Unity iOS).

An application can override any and all of the basic interaction between Android OS and Unity Android. You can enable this by creating a new Activity which derives from UnityPlayerActivity (UnityPlayerActivity.java can be found at /Applications/Unity/Unity.app/Contents/PlaybackEngines/AndroidPlayer/src/com/unity3d/player on Mac and usually at C:\Program Files\Unity\Editor\Data\PlaybackEngines\AndroidPlayer\src\com\unity3d\player on Windows).

To do this, first locate the classes.jar shipped with Unity Android. It is found in the installation folder (usually C:\Program Files\Unity\Editor\Data (on Windows) or /Applications/Unity (on Mac)) in a sub-folder called PlaybackEngines/AndroidPlayer/bin. Then add classes.jar to the classpath used to compile the new Activity. The resulting .class file(s) should be compressed into a .jar file and placed in the Assets->Plugins->Android folder. Since the manifest dictates which activity to launch it is also necessary to create a new AndroidManifest.xml. The AndroidManifest.xml file should also be placed in the Assets->Plugins->Android folder.

The new activity could look like the following example, OverrideExample.java:

package com.company.product;

import com.unity3d.player.UnityPlayerActivity;

import android.os.Bundle;
import android.util.Log;

public class OverrideExample extends UnityPlayerActivity {

  protected void onCreate(Bundle savedInstanceState) {

    // call UnityPlayerActivity.onCreate()
    super.onCreate(savedInstanceState);

    // print debug message to logcat
    Log.d("OverrideActivity", "onCreate called!");
  }

  public void onBackPressed()
  {
    // instead of calling UnityPlayerActivity.onBackPressed() we just ignore the back button event
    // super.onBackPressed();
  }
} 

And this is what the corresponding AndroidManifest.xml would look like:

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android" package="com.company.product">
  <application android:icon="@drawable/app_icon" android:label="@string/app_name">
	<activity android:name=".OverrideExample"
			  android:label="@string/app_name"
			  android:configChanges="fontScale|keyboard|keyboardHidden|locale|mnc|mcc|navigation|orientation|screenLayout|screenSize|smallestScreenSize|uiMode|touchscreen">
        <intent-filter>
			<action android:name="android.intent.action.MAIN" />
			<category android:name="android.intent.category.LAUNCHER" />
		</intent-filter>
	</activity>
  </application>
</manifest> 

UnityPlayerNativeActivity

It is also possible to create your own subclass of UnityPlayerNativeActivity. This will have much the same effect as subclassing UnityPlayerActivity but with improved input latency. Be aware, though, that NativeActivity was introduced in Gingerbread and does not work with older devices. Since touch/motion events are processed in native code, Java views would normally not see those events. There is, however, a forwarding mechanism in Unity which allows events to be propagated to the DalvikVM. To access this mechanism, you need to modify the manifest file as follows:-

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android" package="com.company.product">
  <application android:icon="@drawable/app_icon" android:label="@string/app_name">
	<activity android:name=".OverrideExampleNative"
			  android:label="@string/app_name"
			  android:configChanges="fontScale|keyboard|keyboardHidden|locale|mnc|mcc|navigation|orientation|screenLayout|screenSize|smallestScreenSize|uiMode|touchscreen">
  <meta-data android:name="android.app.lib_name" android:value="unity" />
  <meta-data android:name="unityplayer.ForwardNativeEventsToDalvik" android:value="true" />
        <intent-filter>
			<action android:name="android.intent.action.MAIN" />
			<category android:name="android.intent.category.LAUNCHER" />
		</intent-filter>
	</activity>
  </application>
</manifest> 

Note the ".OverrideExampleNative" attribute in the activity element and the two additional meta-data elements. The first meta-data is an instruction to use the Unity library libunity.so. The second enables events to be passed on to your custom subclass of UnityPlayerNativeActivity.

Examples

Native Plugin Sample

A simple example of the use of a native code plugin can be found here

This sample demonstrates how C code can be invoked from a Unity Android application. The package includes a scene which displays the sum of two values as calculated by the native plugin. Please note that you will need the Android NDK to compile the plugin.
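The plugin itself boils down to a C function exported from a shared library. The snippet below is a guess at the shape of such a plugin rather than the sample's actual source; the function name, library name and values are illustrative only:

```c
/* Sketch of a minimal native plugin function (illustrative, not the
   sample's real code). Compiled with the Android NDK into a shared
   library (e.g. libplugin.so) under Assets/Plugins/Android, it can
   then be called from C# via P/Invoke. */
int AddTwoIntegers(int a, int b)
{
    /* The scene would display this sum. */
    return a + b;
}
```

On the C# side, such a function would be bound with a declaration along the lines of [DllImport("plugin")] private static extern int AddTwoIntegers(int a, int b); (names again illustrative).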

Java Plugin Sample

An example of the use of Java code can be found here

This sample demonstrates how Java code can be used to interact with the Android OS and how C++ creates a bridge between C# and Java. The scene in the package displays a button which when clicked fetches the application cache directory, as defined by the Android OS. Please note that you will need both the JDK and the Android NDK to compile the plugins.

Here is a similar example but based on a prebuilt JNI library to wrap the native code into C#.

Page last updated: 2012-09-25



Android Splash Screen

iOS

Under iOS Basic, a default splash screen will be displayed while your game loads, oriented according to the Default Screen Orientation option in the Player Settings.

Users with an iOS Pro license can use any texture in the project as a splash screen. The required texture size depends on the target device (320x480 pixels for 1st-3rd gen devices, 1024x768 for iPad, 640x960 for 4th gen devices); supplied textures will be scaled to fit if necessary. You can set the splash screen textures using the iOS Player Settings.

Android

Under Android Basic, a default splash screen will be displayed while your game loads, oriented according to the Default Screen Orientation option in the Player Settings.

Android Pro users can use any texture in the project as a splash screen. You can set the texture from the Splash Image section of the Android Player Settings. You should also select the Splash scaling method from the following options:

  • Center (only scale down) will draw your image at its natural size unless it is too large, in which case it will be scaled down to fit.
  • Scale to fit (letter-boxed) will draw your image so that the longer dimension fits the screen size exactly. Empty space around the sides in the shorter dimension will be filled in black.
  • Scale to fill (cropped) will scale your image so that the shorter dimension fits the screen size exactly. The image will be cropped in the longer dimension.

Page last updated: 2011-11-08



nacl-gettingstarted

Native Client (NaCl) is a new technology by Google which allows you to embed native executable code in web pages, enabling the deployment of high-performance web apps without requiring a plugin install. Currently, NaCl is only supported in Google Chrome on Windows, Mac OS X and Linux (with Chrome OS support being worked on), but the technology is open source, so it could be ported to other browser platforms in the future.

Unity 3.5 offers support for running Unity Web Player content (.unity3d files) using NaCl, allowing content to run in Chrome without a plugin install. This is an early release - it should be stable to use, but it does not yet support all the features of the Unity Web Player, because NaCl is an evolving platform and does not support everything we can do in a browser plugin.

Building and Testing games on NaCl

Building and testing games on NaCl is very simple. You need to have Google Chrome installed. Simply choose "Web Player" in the Build Settings and tick the "Enable NaCl" checkbox. This makes sure the generated unity3d file can be run on NaCl (by including the GLSL ES shaders needed for NaCl, and by disabling dynamic fonts, which NaCl does not support), and installs the NaCl runtime along with an HTML file to launch the game in NaCl. If you click Build & Run, Unity will install your player as an app in Chrome and launch it automatically.

Shipping Games with NaCl

In its current state, NaCl is not enabled for generic web pages in Chrome by default. While you can embed a NaCl player into any web page and direct your users to manually enable NaCl in chrome://flags, the only way to currently ship NaCl games that work out of the box is to deploy them on the Chrome Web Store (where NaCl is enabled by default). Note that the Chrome Web Store is fairly unrestrictive, and allows you to host content embedded into your own web site, or to use your own payment processing system if you like. The plan is that this restriction will be lifted when Google has finished a new technology called Portable NaCl (PNaCl), which lets you ship executables as LLVM bitcode, making NaCl apps independent of any specific CPU architecture. At that point, NaCl should be enabled for arbitrary web sites.

Notes on Build size

When you make a NaCl build, you will probably notice that the unity_nacl_files_3.x.x folder is very large, over 100 MB. If you are wondering whether all of this data needs to be downloaded each time NaCl content is run, the answer is generally no.

There are two ways to serve apps on the Chrome Web Store: as a hosted app or as a packaged app. If you serve your content as a packaged app, all data will be downloaded on install as a compressed archive, which is then stored on the user's disk. If you serve your content as a hosted app, data will be downloaded from the web each time. However, the NaCl runtime will only download the relevant architecture (i686 or x86_64) from the unity_nacl_files_3.x.x folder, and when the web server is configured correctly, the data will be compressed on transfer, so the actual amount of data transferred should be around 10 MB (less when physics stripping is used).

The unity_nacl_files_3.x.x folder contains a .htaccess file to set up Apache to compress the data on transfer. If you are using a different web server, you may have to set this up yourself.
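For servers other than Apache, the goal is simply to enable on-the-fly compression for the NaCl runtime files. As a rough sketch only (assuming Apache with mod_deflate enabled; the .htaccess Unity ships may differ in detail), an equivalent configuration could look like:

```apache
# Sketch: compress NaCl runtime files on transfer.
# Assumes mod_deflate is enabled; not the exact contents of Unity's .htaccess.
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE application/octet-stream application/x-nacl
</IfModule>
```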

Limitations in NaCl

NaCl does not yet support all the features of the regular Unity Web Player. Support for many of these will come in future versions of Chrome and Unity. Currently, these features are unsupported by NaCl:

The following features are supported, but have some limitations:

  • Depth textures are required for real-time shadows and other effects. Depth textures are supported in Unity NaCl, but Chrome's OpenGL ES 2.0 implementation does not support the required extensions on Windows, so depth textures will only work on OS X and Linux.
  • NaCl uses OpenGL ES 2.0, which does not support all the extensions included in normal OpenGL. This means that some features relying on extensions, such as linear and HDR lighting, will not currently work on NaCl. Also, shaders need to be able to compile as GLSL shaders. Currently, not all built-in Unity shaders support this; for instance, Screen Space Ambient Occlusion is not supported in GLSL.
  • Cursor locking is supported, but only in fullscreen mode. Cursor locking in windowed mode is planned for a later Chrome release.
  • NaCl does not have support for hardware exception handling. That means that a NullReferenceException in script code results in a crash in NaCl. You can, however, pass softexceptions="1" to the embed parameters (set automatically by Unity when building a development player) to tell Mono to check for null references in software, which results in slower script execution but no crashes.
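As a sketch of how the softexceptions flag fits in, it would sit alongside the other embed parameters in the page hosting the NaCl module. The tag below is illustrative only; every attribute except softexceptions is a placeholder and not the exact markup Unity generates:

```html
<!-- Illustrative sketch: passing softexceptions to the NaCl module.
     Attribute names/values other than softexceptions are placeholders. -->
<embed id="UnityEmbed" name="UnityEmbed"
       src="game.nmf" type="application/x-nacl"
       width="640" height="480"
       softexceptions="1" />
```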

While Google does not give any system requirements for NaCl other than requiring at least OS X 10.6.7 on the Mac, we have found that it does not work well on older systems - especially systems with old GPUs or graphics drivers, or little installed main memory. If you need to target old hardware, you may find that the Web Player will give you a better experience.

Fullscreen mode:

Fullscreen mode is supported by setting Screen.fullScreen, but you can only enter fullscreen mode in a frame where the user has released the mouse button. NaCl will not actually change the hardware screen resolution, which is why Screen.resolutions will only ever return the current desktop resolution. However, Chrome supports rendering into smaller back buffers and scaling those up when blitting to the screen. So, requesting resolutions smaller than the desktop resolution is generally supported for fullscreen mode, but will result in GPU-based scaling instead of a change in screen mode.

WWW class:

The WWW class is supported in NaCl, but follows different security policies than the Unity Web Player. While the Unity Web Player uses crossdomain.xml policy files, similar to Flash, Unity NaCl has to follow the cross-origin security model used by NaCl, documented here. Basically, in order to access documents on a domain other than the one hosting the player, you need to configure your web server to send an Access-Control-Allow-Origin response header for the requests, allowing the domain hosting the player.
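For example, if the player is hosted on a domain of your own and fetches data from another server running Apache, a sketch of the required response header (assuming mod_headers is available; the origin below is a placeholder) would be:

```apache
# Sketch: allow WWW class requests from the domain hosting the NaCl player.
# Requires mod_headers; the origin is a placeholder, not a real domain.
Header set Access-Control-Allow-Origin "http://game.example.com"
```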

Communicating with browser javascript in NaCl

Interacting with the web page using JavaScript is supported, and is very similar to using the Unity Web Player, with one exception: The syntax for sending messages to Unity from html javascript is different, because it has to go through the NaCl module. When you are using the default Unity-generated html, then this code will work:

document.getElementById('UnityEmbed').postMessage("GameObject.Message(parameter)");

Logging

Since NaCl does not allow access to the user file system, it will not write log files. Instead it outputs all logging to stdout. To see the player logs from NaCl:

Page last updated: 2012-09-21



flash-gettingstarted

What is Unity Flash?

The Flash build option allows Unity to publish SWF (Shockwave Flash) files. These SWF files can be played by a Flash plugin installed in your browser. Most computers will either have a Flash Player installed or can have one installed by visiting the Adobe Flash website. Just as a Web Player build creates a file with your 3D assets, audio, physics and scripts, Unity can build a SWF file. All the scripts from your game are automatically converted to ActionScript, the scripting language that the Flash Player works with.

Note that the Unity Flash build option exports SWF files for playback in your browser. The SWF is not intended for playback on mobile platforms.

Performance Comparison

We do not currently have direct comparisons of Unity Web Player content vs Flash SWF content. Much of our Web Player code is executed as native code; for example, PhysX runs as native code. By comparison, when building a SWF file, all of the physics runtime code (collision detection, Newtonian physics) is converted to ActionScript. Typically you should expect the SWF version to run more slowly than the Unity Web Player version. We are, of course, doing everything we can to optimize for Flash.

Further reading:

Other Examples:

Useful Resources:

Page last updated: 2012-10-24



flash-setup

Installing Unity for Flash

To view the SWF files that Unity creates, your web browser will need Adobe Flash Player 11.2 or newer, which you can obtain from http://get.adobe.com/flashplayer/. If you have Flash Player already installed, please visit http://kb2.adobe.com/cps/155/tn_15507.html to check that you have at least version 11.2. Adobe Flash Player 11 introduced the Stage 3D Accelerated Graphics Rendering feature that Unity requires for 3d rendering.

For system requirements see http://www.adobe.com/products/flashplayer/tech-specs.html

Flash Player Switcher

This will allow you to switch between debug (slow) and regular (fast) versions of the Flash Player. Ensure you have Adobe AIR installed, or download it from http://get.adobe.com/air/. The Flash Player Switcher can be obtained from: https://github.com/jvanoostveen/Flash-Player-Switcher/downloads (select FlashPlayerSwitcher.air). Note: it currently supports only Mac OS X.

Other Adobe Tools/Platforms

No other Adobe tools or platforms are required to develop with Unity and create SWF files. To embed the SWF that Unity builds into your own Flash Application you will need one of Adobe FlashBuilder/PowerFlasher FDT/FlashDeveloper/etc and be an experienced Flash developer. You will need to know:

Page last updated: 2012-10-24



flash-building

The following is a step-by-step guide to build and run a new project exported to Flash.

  1. Create your Unity content.
  2. Choose File->Build Settings to bring up the Build Settings dialog and add your scene(s).
  3. Change the Platform to Flash Player
  4. Target Player can be left as the default. This option enables you to change the target Flash Player based on the features you require (see http://www.adobe.com/support/documentation/en/flashplayer/releasenotes.html for details).
  5. Tick Development Build. (This causes Unity to not compress the final SWF file. Skipping compression makes the build faster, and the SWF file will not have to be decompressed before being run in the Flash Player. Note that an empty scene built using the Development Build option will be around 16 MB in size, compared to around 2 MB compressed.)
  6. Press the Build button.

Unity will build a SWF file at the location you choose. Additionally it will create the following files:

To view your Flash-built content open the html file. Do not open the SWF file directly.

Build-and-run will create the same files, launch your default browser and load the generated html file.

The embeddingapi.swc file created in the build allows you to load the SWF in your own project. Embedding the Unity content in a standard Flash project allows you to build your GUI in Flash. This type of Flash integration will, of course, not work in any of the other build targets.

As with the other build targets, there are Player settings that you can specify. Most of the Flash settings are shared with other platforms. Note that the resolution for the content is taken from the Standalone player settings.

Unity also provides a Flash API that gives you texture handles; in combination with the SWC embedding, this gives you the means to use webcam, video, and vector graphics from Flash as textures.


The Build Process

The Unity Flash Publisher attempts to convert scripts from C#/UnityScript into ActionScript. In this process, there can be two kinds of conversion errors:

Errors during conversion will point to the original files and will have the familiar UnityScript error messages with file names and line numbers.

Errors during the compilation of the converted ActionScript will take you to the message in the generated ActionScript code (with filenames ending with .as).


Debugging Converted ActionScript Code

During a build, the converted ActionScript (.as) files are stored within your project folder in:

If you encounter errors with your SWF (at runtime or during a build), it can be useful to look at this converted code.

ActionScript errors at compilation time may not be easy to understand. Just remember that the ActionScript is generated from your game script code, so any changes you need to make will be in your original code and not the converted ActionScript files.


Building for a specific Flash Player version

The dropdown box in the build settings window will enable you to choose which Flash Player version you wish to target. This will always default to the lowest supported Flash Player version (currently 11.2) upon creating/reopening your Unity project.

If you wish to build for a specific Flash Player version you can do so by creating an editor script to perform the build for you. In order to do this, you can specify a FlashBuildSubtarget in your EditorUserBuildSettings when building to Flash from an editor script. For example:

EditorUserBuildSettings.flashBuildSubtarget = FlashBuildSubtarget.Flash11dot2;
BuildPipeline.BuildPlayer(..., ..., BuildTarget.FlashPlayer, BuildOptions.Development);
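For reference, a complete editor script built around this API might look like the following sketch. The menu item name, scene path and output path below are placeholders introduced for illustration, not values from this manual:

```csharp
// Sketch: build a Flash player targeting Flash Player 11.2 from an editor script.
// Scene list and output location are placeholder values.
using UnityEditor;

public class FlashBuildExample
{
    [MenuItem("Build/Flash 11.2 Development Build")]
    static void BuildFlash11dot2()
    {
        // Force the 11.2 subtarget regardless of the Build Settings window state.
        EditorUserBuildSettings.flashBuildSubtarget = FlashBuildSubtarget.Flash11dot2;

        BuildPipeline.BuildPlayer(
            new string[] { "Assets/Scenes/Main.unity" }, // placeholder scene path
            "Builds/FlashBuild",                         // placeholder output path
            BuildTarget.FlashPlayer,
            BuildOptions.Development);
    }
}
```

Placing such a script in an Editor folder adds the menu item to the Unity editor.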


Example Build Errors and Warnings

Below are some common errors/warnings you may encounter when using the Flash export. We also have sections on the Forums and Answers dedicated to Flash export which may be of help if your error is not listed below.

Unable to find Java

Error building Player: Exception: Compiling SWF Failed: Unable to launch Java - is the Java Runtime Environment (JRE) installed?

If you encounter the above error at build time, please install the 32-bit JRE and try again.


'TerrainCollider' is not supported

'TerrainCollider' is not supported when building for FlashPlayer. 
'TerrainData' is not supported when building for FlashPlayer. 
Asset: 'Assets/New Terrain.asset'

The terrain feature is not supported when building for the FlashPlayer target. All unsupported features will generate a similar warning. Note that the build will continue; however, the unsupported feature will be missing from the final SWF.


Unboxing

Error: Call to a possibly undefined method RuntimeServices_UnboxSingle_Object through a reference with static type Class.

This is likely because the conversion between types that is defined on the UnityScript side is not defined for our Flash Publisher. Any time you see an error that refers to Unbox, it means a type conversion is required but cannot be found. In order to resolve these issues:


UnauthorizedAccessException

Error building Player: UnauthorizedAccessException: Access to the path "Temp/StagingArea/Data/ConvertedDotNetCode/global" is denied.

If Unity-generated ActionScript files are open in a text editor, Unity may refuse to build issuing this error. To fix this, please close the ActionScript files and allow Unity to overwrite them.

Page last updated: 2012-11-06



flash-debugging

Where can I find my Flash Player log file?

Make sure you've done all of the following:

1) Install "content debugger" version of the Adobe Flash Player plugin from: http://www.adobe.com/support/flashplayer/downloads.html

2) Go to http://flashplayerversion.com/, and make sure that it says 'Debugger: Yes'

3) Be careful using Chrome as it ships with its own Flash Player. If you wish to use Chrome with the debug Flash Player, you can do so by following these instructions: http://helpx.adobe.com/flash-player/kb/flash-player-google-chrome.html

4) Create a file called mm.cfg which will instruct the Flash Player to create a logfile. The mm.cfg file needs to be placed here:

  • Macintosh OS X: /Library/Application Support/Macromedia/mm.cfg
  • Windows XP: C:\Documents and Settings\username\mm.cfg
  • Windows Vista/Win7: C:\Users\username\mm.cfg
  • Linux: /home/username/mm.cfg

Write this text in the mm.cfg file:

ErrorReportingEnable=1
TraceOutputFileEnable=1

5) Find and open your flashlog.txt here:

  • Macintosh OS X: /Users/username/Library/Preferences/Macromedia/Flash Player/Logs/
  • Windows XP: C:\Documents and Settings\username\Application Data\Macromedia\Flash Player\Logs
  • Windows Vista/Win7: C:\Users\username\AppData\Roaming\Macromedia\Flash Player\Logs
  • Linux: /home/username/.macromedia/Flash_Player/Logs/

Note that whilst your content is running, this flashlog.txt will be updated constantly as new debug messages are generated by your script code. You may need to reload the file, or use an editor that can reload the file as it grows in size.

More details about enabling debug logs when using SWFs is available at: http://livedocs.adobe.com/flex/3/html/help.html?content=logging_04.html.

Page last updated: 2012-11-06



flash-whatssupported

Supported


Limited support


Not Currently Supported


Won't be supported


Texture Support

We support JPEG textures, as well as RGBA / Truecolor. Textures which are JPEG-XR compressed are not readable and thus not supported.

The compression ratio can be specified in the texture importer under the 'Override for FlashPlayer' setting. Compressed textures are converted to JPEG with the chosen compression ratio. The compression ratio is worth experimenting with, since it can considerably reduce the size of the final SWF.

Texture quality ranges from 0 to 100, with 100 indicating no compression, and 0 the highest amount of compression possible.

The maximum supported texture resolution is 2048x2048.


Unavailable APIs

Page last updated: 2012-11-06



flash-embeddingapi

embeddingapi.swc

If you want to embed your Unity generated Flash content within a larger Flash project, you can do so using the embeddingapi.swc. This SWC provides functionality to load and communicate with Unity published Flash content. In the embeddingapi.swc file, you will find two classes and two interfaces. Each of these, and their available functions, are described below.

When your Unity Flash project is built, a copy of the embeddingapi.swc file will be placed in the same location as your built SWF. You can then use this in your Flash projects as per other SWCs. For more details on what SWCs are and how to use them, see Adobe's documentation.


Stage3D Restrictions

When embedding your Unity Flash content within another Flash project, it is useful to understand the Flash display model. All Stage3D content is displayed behind the Flash Stage. This means that any Flash display list content added to the Stage will always render in front of your 3D content. For more information on this, please refer to Adobe's "How Stage3D Works" page.


IUnityContent

IUnityContent is implemented by Unity-built Flash content. This interface is how you communicate with or modify the Unity content.

Methods:

  • getTextureFromNativeId(id : int) : TextureBase; Enables retrieval of textures. A full example project using this can be found on the forums.
  • sendMessage(objectPath : String, methodName : String, value : Object = null) : Boolean; Calls a method on an object in the Unity content.
  • setContentHost(contentHost : IUnityContentHost) : void; Sets the host (which must implement IUnityContentHost) for the Unity content. The host can then listen for when the Unity content has loaded/started.
  • setSize(width : int, height : int) : void; Modifies the size of the Unity content.
  • setPosition(x : int = 0, y : int = 0) : void; Enables you to reposition the Unity content within the content host.
  • startFrameLoop() : void; Starts the Unity content.
  • stopFrameLoop() : void; Stops the Unity content.
  • forceUnload() : void; Unloads the Unity Flash content.


IUnityContentHost

This must be implemented by whichever class will host the Unity content.

Methods:

  • unityInitComplete() : void; Called when the Unity engine is done initializing and the first level is loaded.
  • unityInitStart() : void; Called when the content is loaded and the initialization of the Unity engine is started.


UnityContentLoader

The UnityContentLoader class can be used to load Unity published Flash content and extends the AS3 Loader class. As with standard AS3 Loader instances, you can add event listeners to its contentLoaderInfo in order to know the progress of the load and when it is complete.

Constructor:

UnityContentLoader(contentURL : String, contentHost : IUnityContentHost = null, params : UnityLoaderParams = null, autoLoad : Boolean = true) : void;

Creates a UnityContentLoader instance which you can attach event listeners to and use to load the unity content.

Accessible Properties:

  • unityContent : IUnityContent; Once the content has finished loading, you can access the Unity content to perform functionality such as sendMessage().

Methods:

  • loadUnity() : void; Instructs the UnityContentLoader to load the Unity content from the URL supplied in the constructor.
  • forceUnload() : void; Unloads the Unity content from the host.
  • unload() : void; Overrides the default unload() method of the AS3 Loader class and calls forceUnload().
  • unloadAndStop(gc : Boolean = true) : void; Unloads the Unity content then calls the default Loader implementation of unloadAndStop(gc).


UnityLoaderParams

Constructor:

Parameters can be supplied to the UnityContentLoader when created to provide additional loader configuration.

function UnityLoaderParams(scaleToStage : Boolean = false, width : int = 640, height : int = 480, usePreloader : Boolean = false, autoInit : Boolean = true, catchGlobalErrors : Boolean = true) : void;


Example

The following example shows how to load Unity published Flash content into a host SWF. It shows how to supply custom UnityLoaderParams and track progress of the file load. Once the Unity content has been added to the host, a function in the Unity content is called using the sendMessage function.


public class MyLoader extends Sprite implements IUnityContentHost
{
  private var unityContentLoader:UnityContentLoader;

  public function MyLoader()
  {
      var params:UnityLoaderParams = new UnityLoaderParams(false,720,400,false);
      unityContentLoader = new UnityContentLoader("UnityContent.swf", this, params, false);
      unityContentLoader.contentLoaderInfo.addEventListener(ProgressEvent.PROGRESS, onUnityContentLoaderProgress);
      unityContentLoader.contentLoaderInfo.addEventListener(Event.COMPLETE, onUnityContentLoaderComplete);
      unityContentLoader.loadUnity();
  }

  private function onUnityContentLoaderProgress(event:ProgressEvent):void
  {
      //Respond to load progress
  }

  private function onUnityContentLoaderComplete(event:Event):void
  {
     addChild(unityContentLoader);
     unityContentLoader.unityContent.setContentHost(this);
  }

  //unityInitStart has to be implemented by whatever implements IUnityContentHost
  //This is called when the content is loaded and the initialization of the unity engine is started.
  public function unityInitStart():void
  {
    //Unity engine started	
  }

  //unityInitComplete has to be implemented by whatever implements IUnityContentHost
  //This is called when the unity engine is done initializing and the first level is loaded.
  public function unityInitComplete():void
  {
     unityContentLoader.unityContent.sendMessage("Main Camera","SetResponder",{responder:this});
  }

  ...

}

Page last updated: 2012-11-06



flash-adobelicense

What is the license and why is it needed?

When publishing your Unity project to Flash, you will need to acquire a license from Adobe in order for the content to work in the Flash Player. The Adobe documentation of premium features explains why a license is required for Unity built Flash games:

"Premium Features includes the XC APIs (domain memory APIs in combination with Stage3D hardware acceleration APIs), which allows C/C++ developers and other developers using 3rd party tools, including Unity, to target Flash Player for the distribution of their games."

For more information and the latest details on the license, please refer to the Adobe article which explains this in detail.


How do I obtain a license?

To obtain a license, you will need to sign into https://www.adobefpl.com/ using your AdobeId and follow their instructions.


Further reading

Page last updated: 2012-11-06



flashexamples-supplyingdata

If you wish to supply data from Flash to Unity, it must be one of the supported types. You can also create classes to represent the data (by providing a matching C# or JavaScript implementation).

First, create an AS3 implementation of your object and include the class in your project (in a folder called ActionScript):

public class ExampleObject
{
    public var anInt : int;
    public var someString : String;
    public var aBool : Boolean;
}

Now create a C# or JavaScript object which matches the AS3 implementation.

The NotRenamed attribute used below prevents name mangling of constructors, methods, fields and properties.

The NotConverted attribute instructs the build pipeline not to convert a type or member to the target platform. Normally when you build to Flash, each of your C#/JavaScript scripts are converted to an ActionScript (.as) script. Adding the [NotConverted] attribute overrides this process, allowing you to provide your own version of the .as script, manually. The dummy C#/JavaScript which you provide allows Unity to know the signature of the class (i.e. which functions it should be allowed to call), and your .as script provides the implementations of those functions. Note that the ActionScript version will only be used when you build to Flash. In editor or when built to other platforms, Unity will use your C#/JavaScript version.

C#

[NotConverted]
[NotRenamed]
public class ExampleObject
{
    [NotRenamed]
    public int anInt;

    [NotRenamed]
    public string someString;

    [NotRenamed]
    public bool aBool;
}

JavaScript

@NotConverted
@NotRenamed
class ExampleObject
{
    @NotRenamed
    public var anInt : int;

    @NotRenamed
    public var someString : String;

    @NotRenamed
    public var aBool : boolean;
}

Now you need a way in AS3 to retrieve your object, e.g.:

public static function getExampleObject() : ExampleObject
{
    return new ExampleObject();
}

You can then retrieve the object and access its data:

ExampleObject exampleObj = UnityEngine.Flash.ActionScript.Expression<ExampleObject>("MyStaticASClass.getExampleObject()");
Debug.Log(exampleObj.someString);

Page last updated: 2012-10-24



flashexamples-callingflashfunctions

This example shows how you can call different AS3 functions from Unity. You will encounter three scripts:

When built to Flash, the AS3 implementation of ExampleClass is used. When run in-editor or built to any platform other than Flash the C#/JavaScript implementation will be used.

Creating an ActionScript version of your classes enables you to use native AS3 libraries when building for Flash Player. This is particularly useful when you need to work around a .NET library which isn't yet supported for Flash export.


ExampleClass.as

public class ExampleClass
{
  public static function aStaticFunction() : void
  {
    trace("aStaticFunction - AS3 Implementation");
  }

  public static function aStaticFunctionWithParams(a : int) : void
  {
    trace("aStaticFunctionWithParams - AS3 Implementation");
  }

  public static function aStaticFunctionWithReturnType() : int
  {
    trace("aStaticFunctionWithReturnType - AS3 Implementation");
    return 1;
  }

  public function aFunction() : void
  {
    trace("aFunction - AS3 Implementation");
  }
}


ExampleClass - C#/JavaScript Implementation

You can create the class to mimic the AS3 implementation in either C# or JavaScript. The implementations are very similar. Both examples are provided below.

C# Implementation (ExampleClass.cs)

using UnityEngine;

[NotRenamed]
[NotConverted]
public class ExampleClass
{
    [NotRenamed]
    public static void aStaticFunction()
    {
        Debug.Log("aStaticFunction - C# Implementation");
    }

    [NotRenamed]
    public static void aStaticFunctionWithParams(int a)
    {
        Debug.Log("aStaticFunctionWithParams - C# Implementation");
    }

    [NotRenamed]
    public static int aStaticFunctionWithReturnType()
    {
        Debug.Log("aStaticFunctionWithReturnType - C# Implementation");
        return 1;
    }

    [NotRenamed]
    public void aFunction()
    {
        Debug.Log("aFunction - C# Implementation");
    }
}

JavaScript Implementation (ExampleClass.js)

@NotConverted
@NotRenamed
class ExampleClass
{
    @NotRenamed
    static function aStaticFunction()
    {
        Debug.Log("aStaticFunction - JS Implementation");
    }

    @NotRenamed
    static function aStaticFunctionWithParams(a : int)
    {
        Debug.Log("aStaticFunctionWithParams - JS Implementation");
    }

    @NotRenamed
    static function aStaticFunctionWithReturnType() : int
    {
      Debug.Log("aStaticFunctionWithReturnType - JS Implementation");
      return 1;
    }

    @NotRenamed
    function aFunction()
    {
        Debug.Log("aFunction - JS Implementation");
    }
}


How to Call the Functions

The code below will call the methods in the ActionScript (.as) implementation when building for Flash. This allows you to use native AS3 libraries in your Flash export projects. When building to a non-Flash platform or running in the editor, the C#/JS implementation of the class will be used.

ExampleClass.aStaticFunction();
ExampleClass.aStaticFunctionWithParams(1);
int returnedValue = ExampleClass.aStaticFunctionWithReturnType();

ExampleClass exampleClass = new ExampleClass();
exampleClass.aFunction();

Page last updated: 2012-11-06



flashexamples-browserjavascriptcommunication

This example shows how AS3 code can communicate with JavaScript in the browser. This example makes use of the ExternalInterface ActionScript class.

When run, the BrowserCommunicator.TestCommunication() function will register a callback that the browser JavaScript can then call. The ActionScript will then call out to the browser JavaScript, causing an alert popup to be displayed. The exposed ActionScript function will then be invoked by the JavaScript, completing the two-way communication test.


Required JavaScript

The following JavaScript needs to be added to the html page that serves the Unity published SWF. It creates the function which will be called from ActionScript:

<script type="text/javascript">

function calledFromActionScript()
{
    alert("ActionScript called JavaScript function");

    var obj = swfobject.getObjectById("unityPlayer");
    if (obj)
    {
        obj.callFromJavascript();
    }
}

</script> 


BrowserCommunicator.as (and matching C# class)

package
{
    import flash.external.ExternalInterface;
    import flash.system.Security;

    public class BrowserCommunicator
    {
        //Exposed so that it can be called from the browser JavaScript.
        public static function callFromJavascript() : void
        {
            trace("Javascript successfully called ActionScript function.");
        }

        //Sets up an ExternalInterface callback and calls a Javascript function.
        public static function TestCommunication() : void
        {
            if (ExternalInterface.available)
            {
                try
                {
                    ExternalInterface.addCallback("callFromJavascript", callFromJavascript);
                }
                catch (error:SecurityError)
                {
                    trace("A SecurityError occurred: " + error.message);
                }
                catch (error:Error)
                {
                    trace("An Error occurred: " + error.message);
                }

                ExternalInterface.call('calledFromActionScript');
            }
            else
            {
                trace("External interface not available");
            }
        } 
    }
}

C# dummy implementation of the class:

[NotConverted]
[NotRenamed]
public class BrowserCommunicator
{
   [NotRenamed]
   public static void TestCommunication()
   {
   }
}


How to test

Simply call BrowserCommunicator.TestCommunication() to run the two-way communication test.


Potential Issues

Security Sandbox Violation

A SecurityError occurred: Error #2060: Security sandbox violation

This happens when your published SWF does not have permission to access your HTML file. To fix this locally, you can either add the location of the SWF to the Flash Player's list of trusted locations, or serve the files from a local web server.

For more information on the Flash Security Sandboxes, please refer to the Adobe documentation.

Page last updated: 2012-10-24



flashexamples-accessingthestage

You can access the Flash Stage from your C#/JS scripts in the following way:

ActionScript.Import("com.unity.UnityNative"); 
ActionScript.Statement("trace(UnityNative.stage);");

As an example, the following C# code will output the flashvars supplied to a SWF:

ActionScript.Import("flash.display.LoaderInfo"); 	
ActionScript.Statement(
    "var params:Object = LoaderInfo(UnityNative.stage.loaderInfo).parameters;" +
    "var key:String;" +
    "for (key in params) {" +
        "trace(key + '=' + params[key]);" +
    "}"
);

Page last updated: 2012-11-06



FAQ

The following is a list of common tasks in Unity and how to accomplish them.

Page last updated: 2012-11-14



Upgrade guide from 3.5 to 4.0

GameObject active state

Unity 4.0 changes how the active state of GameObjects is handled. GameObject's active state is now inherited by child GameObjects, so that any GameObject which is inactive will also cause its children to be inactive. We believe that the new behavior makes more sense than the old one, and should have always been this way. Also, the upcoming new GUI system heavily depends on the new 4.0 behavior, and would not be possible without it. Unfortunately, this may require some work to fix existing projects to work with the new Unity 4.0 behavior, and here is the change:

The old behavior: a GameObject's active state was independent of its parents; deactivating a parent GameObject did not deactivate its children.

The new behavior: a GameObject is only truly active if it and all of its parents are active; deactivating a GameObject also deactivates all of its children.

Example:

You have three GameObjects, A, B and C, where B and C are children of A. If you deactivate A, then B and C also become inactive in the scene, even though their own local active state is unchanged; re-activating A makes them active again.
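In code, the distinction shows up as two properties on GameObject: activeSelf (the local state set by SetActive) and activeInHierarchy (whether the object is actually active, taking parents into account). A minimal sketch of the example above:

```csharp
using UnityEngine;

// Sketch of the Unity 4.0 semantics for the A/B/C example above:
// deactivating the parent deactivates the children in the hierarchy,
// but leaves their local (activeSelf) state untouched.
public class ActiveStateExample : MonoBehaviour
{
    void Start()
    {
        GameObject a = new GameObject("A");
        GameObject b = new GameObject("B");
        GameObject c = new GameObject("C");
        b.transform.parent = a.transform;
        c.transform.parent = a.transform;

        a.SetActive(false);                 // deactivate the parent

        Debug.Log(b.activeInHierarchy);     // false - B is inactive in the scene
        Debug.Log(b.activeSelf);            // true - B itself was never deactivated

        a.SetActive(true);                  // reactivating A reactivates B and C
        Debug.Log(b.activeInHierarchy);     // true
    }
}
```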

The new active state in the editor

To visualize these changes, in the Unity 4.0 editor any GameObject which is inactive (either because its own .activeSelf property is set to false, or because that of one of its parents is) will be greyed out in the hierarchy, and will have a greyed out icon in the inspector. The GameObject's own .activeSelf property is reflected by its active checkbox, which can be toggled regardless of parent state (but it will only activate the GameObject if all parents are active).

How this affects existing projects:

Changes to the asset processing pipeline

During the development of 4.0, our asset import pipeline changed internally in some significant ways in order to improve performance, memory usage and determinism. For the most part these changes do not have an impact on the user, with one exception: objects in assets are not made persistent until the very end of the import pipeline, and any previously imported version of an asset will be completely replaced.

The first part means that during post processing you cannot get correct references to objects in the asset; the second part means that if you use references to a previously imported version of the asset during post processing to store modifications, those modifications will be lost.

Example of references being lost because they are not persistent yet

Consider this small example:

public class ModelPostprocessor : AssetPostprocessor
{
    public void OnPostprocessModel(GameObject go)
    {
        PrefabUtility.CreatePrefab("Prefabs/" + go.name, go);
    }
}

In Unity 3.5 this would create a prefab with all the correct references to the meshes and so on, because all the meshes would already have been made persistent. Since this is not the case in Unity 4.0, the same post processor will create a prefab where all the references to the meshes are gone, simply because Unity 4.0 does not yet know how to resolve the references to objects in the original model prefab. To correctly copy a model prefab into a regular prefab, you should use OnPostprocessAllAssets to go through all imported assets, find the model prefab and create new prefabs as above.
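As a sketch of that approach (assuming, for illustration, a hypothetical Assets/Prefabs target folder and that the models are .fbx files), the post processor could be rewritten along these lines:

```csharp
using UnityEngine;
using UnityEditor;

// Sketch of a 4.0-safe way to copy a model prefab: wait until
// OnPostprocessAllAssets, when the imported objects have been made
// persistent, before creating the new prefab. The "Assets/Prefabs"
// folder and the ".fbx" filter are hypothetical, for illustration only.
public class ModelPostprocessor : AssetPostprocessor
{
    static void OnPostprocessAllAssets(
        string[] importedAssets, string[] deletedAssets,
        string[] movedAssets, string[] movedFromAssetPaths)
    {
        foreach (string path in importedAssets)
        {
            if (!path.EndsWith(".fbx"))
                continue;   // only handle model files in this sketch

            // At this point the model and its meshes are persistent,
            // so references inside the new prefab resolve correctly.
            GameObject model = (GameObject)AssetDatabase.LoadMainAssetAtPath(path);
            if (model != null)
                PrefabUtility.CreatePrefab("Assets/Prefabs/" + model.name + ".prefab", model);
        }
    }
}
```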

Example of references to previously imported assets being discarded

The second example is a little more complex, but is an actual use case we have seen that worked in 3.5 and broke in 4.0. Here is a simple ScriptableObject with a reference to a mesh.

public class Referencer : ScriptableObject
{
    public Mesh myMesh;
}

We use this ScriptableObject to create an asset with a reference to a mesh inside a model; then, in our post processor, we take that reference and give the mesh a different name. The end result is that after the model has been reimported, the name of the mesh is whatever the post processor determined.

public class Postprocess : AssetPostprocessor
{
    public void OnPostprocessModel(GameObject go)
    {
        Referencer myRef = (Referencer)AssetDatabase.LoadAssetAtPath(
            "Assets/MyRef.asset", typeof(Referencer));
        myRef.myMesh.name = "AwesomeMesh";
    }
}

This worked fine in Unity 3.5, but in Unity 4.0 the already imported model is completely replaced, so changing the name of the mesh from a previous import has no effect. The solution here is to find the mesh by some other means and change its name. The most important thing to note is that in Unity 4.0 you should ONLY modify the given input of the post processor, and not rely on a previously imported version of the same asset.
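A sketch of a 4.0-friendly version of the post processor above, renaming the mesh through the imported GameObject itself rather than through the stale asset reference:

```csharp
using UnityEngine;
using UnityEditor;

// Sketch: modify only the input passed to the post processor.
// Here the mesh is found via the imported GameObject's MeshFilter
// instead of via a reference to the previously imported asset.
public class Postprocess : AssetPostprocessor
{
    public void OnPostprocessModel(GameObject go)
    {
        MeshFilter filter = go.GetComponentInChildren<MeshFilter>();
        if (filter != null && filter.sharedMesh != null)
            filter.sharedMesh.name = "AwesomeMesh";
    }
}
```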

Page last updated: 2012-10-29



Upgrade guide from 3.4 to 3.5

If you have an FBX file with a root node marked up as a skeleton, it will be imported with an additional root node in 3.5, compared to 3.4

Unity 3.5 does this because when importing animated characters, the most common setup is to have one root node with all bones below and a skeleton next to it in the hierarchy. When creating additional animations, it is common to remove the skinned mesh from the fbx file. In that case the new import method ensures that the additional root node always exists and thus animations and the skinned mesh actually match.

If the connection between the instance and the FBX file's prefab was broken in 3.4, the animation will not match in 3.5, and as a result your animation might not play.

In that case it is recommended that you recreate the prefabs or Game Object hierarchies by dragging your FBX file into your scene and recreating it.

Page last updated: 2012-02-04



HowToUpgradeFrom2xTo3x

With a regular point release of Unity, your project is automatically upgraded from the previous minor version of the same major version the first time it is opened in the new editor. New properties are given default values and formats are converted. A major version change such as 2.x to 3.x, however, introduces some changes that break backwards compatibility.

For the most part this shows up as content created in a previous version playing back slightly differently when run in the new engine, although some changes require more extensive tweaking to play back as intended. These documents outline the changes from 2.x to 3.x.

Page last updated: 2012-11-09



PhysicsUpgradeDetails

For Unity 3.0 we upgraded the NVIDIA PhysX library from version 2.6 to 2.8.3, which makes many new features available. In general, existing projects should behave roughly the same as in Unity 2.x, but there may be slight differences in the outcome of the physics simulation, so if your content depends on exact behavior or on a chain of physics events, you may need to re-tweak your setup to get the expected behavior in Unity 3.x.

If you are using Configurable Joints, the JointDrive.maximumForce property is now also taken into account when JointDrive.mode is JointDriveMode.Position. If this value is set to its default of 0, the joint applies no force. We automatically change all JointDrive properties imported from older versions when JointDrive.mode is JointDriveMode.Position, but you may need to change this manually when setting up joints from code. Note that we have also changed the default value of JointDrive.maximumForce to infinity.

Page last updated: 2012-11-09



MonoUpgradeDetails

In Unity 3 we upgraded the Mono runtime from 1.2.5 to 2.6, and on top of that there are some JavaScript and Boo improvements. Aside from all the bug fixes and improvements to Mono between the two versions, this page lists some of the highlights.

C# Improvements

Basically the differences between C# 3.0 and C# 2.0, including:
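For example, collection initializers, implicitly typed locals, lambda expressions and LINQ all become available. A small sketch (plain C#, not Unity-specific):

```csharp
using System;
using System.Linq;
using System.Collections.Generic;

// A few of the C# 3 features enabled by the Mono 2.6 upgrade:
// collection initializers, lambdas, implicitly typed locals and LINQ.
public class CSharp3Features
{
    public static void Main()
    {
        var numbers = new List<int> { 4, 1, 3, 2 };   // collection initializer

        // Lambda expression instead of an anonymous delegate.
        numbers.Sort((x, y) => x.CompareTo(y));

        // LINQ query over the sorted list.
        IEnumerable<int> evens = numbers.Where(n => n % 2 == 0);

        Console.WriteLine(string.Join(", ", evens.Select(n => n.ToString()).ToArray()));
        // prints "2, 4"
    }
}
```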

JavaScript Improvements

// Generic collections:
var list = new System.Collections.Generic.List.<String>();
list.Add("foo");

// Function types and anonymous functions:
list.Sort(function(x:String, y:String) {
    return x.CompareTo(y);
});

// Type-inferred, single-expression anonymous functions:
list.Sort(function(x, y) x.CompareTo(y));

// Functions as parameters:
function forEach(items, action: function(Object)) {
    for (var item in items) action(item);
}

// List comprehensions:
function printArray(a: int[]) {
    print("[" + String.Join(", ", [i.ToString() for (i in a)]) + "]");
}

var doubles = [i*2 for (i in range(0, 3))];
var odds = [i for (i in range(0, 6)) if (i % 2 != 0)];
printArray(doubles);
printArray(odds);

// Interfaces:
interface IFoo {
    function bar();
}

class Foo implements IFoo {
    function bar() {
        Console.WriteLine("Foo.bar");
    }
}

// Non-overridable (final) functions:
final function foo() {
}

// Value types (structs):
class Pair extends System.ValueType {
    var First: Object;
    var Second: Object;

    function Pair(fst, snd) {
        First = fst;
        Second = snd;
    }

    override function ToString() {
        return "Pair(" + First + ", " + Second + ")";
    }
}

Boo Improvements

Page last updated: 2011-11-09



RenderingUpgradeDetails

Unity 3 brings a lot of graphics related changes, and some things might need to be tweaked when you upgrade existing Unity 2.x projects. For changes related to shaders, see Shader Upgrade Guide.

Forward Rendering Path changes

Unity 2.x had one rendering path, which is called Forward in Unity 3. Major changes in it compared to Unity 2.x:

Shader changes

See Shader Upgrade Guide for more details. Largest change is: if you want to write shaders that interact with lighting, you should use Surface Shaders.

Obscure Graphics Changes That No One Will Probably Notice TM

Page last updated: 2010-09-26



SL-V3Conversion

Unity 3 has many new features and changes to its rendering system, and ShaderLab has been updated accordingly. Some advanced shaders that were used in Unity 2.x, especially the ones that used per-pixel lighting, will need to be updated for Unity 3. If you have trouble updating them - just ask for our help!

For general graphics related Unity 3 upgrade details, see Rendering Upgrade Details.

When you open your Unity 2.x project in Unity 3.x, it will automatically upgrade your shader files as much as possible. The document below lists all the changes that were made to shaders, and what to do when you need manual shader upgrade.

Per-pixel lit shaders

In Unity 2.x, writing shaders that were lit per-pixel was quite complicated. Those shaders had multiple passes, with LightMode tags on each (usually PixelOrNone, Vertex and Pixel). With the addition of Deferred Lighting in Unity 3.0 and the changes to the old forward rendering, we needed an easier, more robust and future-proof way of writing shaders that interact with lighting. All old per-pixel lit shaders need to be rewritten as Surface Shaders.

Cg shader changes

Built-in "glstate" variable renames

In Unity 2.x, accessing some built-in variables (like model*view*projection matrix) was possible through built-in Cg names like glstate.matrix.mvp. However, that does not work on some platforms, so in Unity 3.0 we renamed those built-in variables. All these replacements will be done automatically when upgrading your project:

Semantics changes

Additionally, it is recommended to use the SV_POSITION semantic (instead of POSITION) for the position in vertex-to-fragment structures.

More strict error checking

Depending on platform, shaders might be compiled using a different compiler than Cg (e.g. HLSL on Windows) that has more strict error checking. Most common cases are:

Other Changes

RECT textures are gone

In Unity 2.x, RenderTextures could be non-power-of-two in size, so-called "RECT" textures. They were designated by the "RECT" texture type in shader properties and used as samplerRECT, texRECT and so on in Cg shaders. Texture coordinates for RECT textures were a special case in OpenGL: they were in pixels. On all other platforms, texture coordinates were just like for any other texture: they went from 0.0 to 1.0 over the texture.

In Unity 3.0 we decided to remove this OpenGL special case and treat non-power-of-two RenderTextures the same everywhere. It is recommended to replace samplerRECT, texRECT and similar uses with regular sampler2D and tex2D. Also, if you were doing any special pixel addressing for the OpenGL case, you need to remove that from your shader, i.e. just keep the non-OpenGL part (look for the SHADER_API_D3D9 or SHADER_API_OPENGL macros in your shaders).

Page last updated: 2010-09-26



Unity 4.x Activation - Overview

What is the new Activation system?

With our new Licensing System, we allow you, the user, to manage your Unity license independently. Contacting the Support Team when you need to switch machines is a thing of the past! The system allows instant, automated migration of your machine with a single click. Please read our 'Managing your Unity 4.0 License' page for more information.

http://docs.unity3d.com/Documentation/Manual/ManagingyourUnity4xLicense.html

If you're looking for step-by-step guides to Activation of Unity, please see the child pages.

FAQ

How many machines can I install my copy of Unity on?

Every paid commercial Unity license allows a *single* person to use Unity on *two* machines that they have exclusive use of, be it a Mac and a PC or your home and work machines. Educational licenses sold via Unity or any one of our resellers are only good for a single activation. The same goes for Trial licenses, unless otherwise stated.

The free version of Unity may not be licensed by a commercial entity with annual gross revenues (based on fiscal year) in excess of US$100,000, or by an educational, non-profit or government entity with an annual budget of over US$100,000.

If you are a Legal Entity, you may not combine files developed with the free version of Unity with any files developed by you (or by any third party) through the use of Unity Pro. Please see our EULA http://unity3d.com/company/legal/eula for further information regarding license usage.


I need to use my license on another machine, but I get the message that my license has been 'Activated too many times'. What should I do?

You'll need to 'Return' your license. This releases the activation on the machine you no longer require, which in turn enables you to reactivate on a new machine. Please refer to the 'Managing your Unity 4.0 License' link at the top of the page for more information.


My account credentials aren't recognised when logging in during the Activation process?

Please ensure that your details are being entered correctly. Passwords ARE case-sensitive, so make sure you're typing your password exactly as you registered it. You can reset your password using the link below:

https://accounts.unity3d.com/password/new

If you're still having issues logging in, please contact support@unity3d.com.


Can I use Unity 4.x with my 3.x Serial number?

No, you can't. In order to use Unity 4.x, you'll need to upgrade to a 4.x license. You can do this online, via our Web Store: https://store.unity3d.com/shop/


I'm planning on replacing an item of hardware and/or my OS. What should I do?

As with changing machines, you'll need to 'Return' your license before making any hardware or OS changes to your machine. If you fail to return the license, our server will see a request from another machine and inform you that you've reached your activation limit for the license. Please refer to the 'Managing your Unity 4.0 License' link at the top of the page for more information regarding the return of a license.


My machine died without me being able to 'Return' my license, what now?

Please email 'support@unity3d.com' explaining your situation, including the details below.

 - The Serial number you were using on the machine.
 - The (local network) name of the machine that died

The Support Team will then be able to 'Return' your license manually.


I have two licenses, each with an add-on I require, how do I activate them in unison on my machine?

You can't, unfortunately! A single license may only be used on one machine at any one time.


Where is my Unity 4.x license file stored?

- /Library/Application Support/Unity/Unity_v4.x.ulf (OS X)

- C:\ProgramData\Unity (Windows)

For any further assistance, please contact support@unity3d.com.

Page last updated: 2012-11-26



Managing your Unity 4.x License

With Unity 4.0 you are now able to manage your license independently (no more contacting Support for migration to your shiny new machine). Below is a guide to how this new system works and performs.


You will notice a new option under the 'Unity' drop-down on your toolbar that reads 'Manage License'. This is the unified place within the Editor for all your licensing needs.

Once you have clicked on the 'Manage License' option you will be faced with the 'License Management' window. You then have four options (see image), explained below:

'Check for updates' cross-references the server, querying your Serial number for any changes that may have been made since you last activated. This is handy for updating your license to include new add-ons once purchased and added to your existing license via the Unity Store.


'Activate a new license' does what it says on the tin. This enables you to activate a new Serial number on the machine you're using.


The 'Return license' feature enables you to return the license on the machine in question, in return for a new activation that can be used on another machine. Once clicked the Editor will close and you will be able to activate your Serial number elsewhere. For more information on how many machines a single license enables use on, please see our EULA: http://unity3d.com/company/legal/eula.


'Manual activation' enables you to activate your copy of Unity offline. This is covered in more depth here: http://docs.unity3d.com/Documentation/Manual/ManualActivationGuide.html.



For any further assistance, please contact support@unity3d.com.

Page last updated: 2012-11-26



Online Activation Guide

Online activation is the easiest and fastest way to get up and running with Unity. Below is a step-by-step guide on how to activate Unity online.


1. Download and install the Unity Editor. The latest version of Unity can be found at http://unity3d.com/unity/download/


2. Fire up the Editor from your Applications folder on OS X or the shortcut in the Start Menu on Windows.


3. You will be faced with a window titled 'Choose a version of Unity'. Select the version of Unity you wish to activate by checking the tick box of the appropriate option, then click 'OK' to proceed.

a. To activate an existing Unity 4.x Serial number generated by the Store or a member of our Sales Team, check the 'Activate an existing serial' box and enter the appropriate Serial number. Once the Serial number has been entered your license Type will be displayed on-screen.
b. To Trial Unity Pro for 30 days Free-Of-Charge, check the 'Activate your free 30-day Unity Pro trial' box.
c. To activate the Free version of Unity, check the 'Activating Unity Free' box.


4. Next, you will encounter the 'Unity Account' window. Here you will need to enter your Unity Developer Network account credentials. (If you don't have an existing account or have forgotten your password, simply click the respective 'Create account' or 'Forgot your password?' links and follow the onscreen prompts to create or retrieve your account.) Once your credentials are entered, you can proceed by clicking 'OK'.


5. You will be shown a 'Thank you' screen; you can now proceed to the Unity Editor by clicking the 'Start using Unity' button.


6. You're all done!


For any further assistance, please contact support@unity3d.com.

Page last updated: 2012-11-28



Manual Activation Guide

With our new Licensing System, the Editor will automatically fall back to manual activation if Online Activation fails, or if you don't have an internet connection. Please see the steps below for an outline of how to manually activate Unity 4.0.


1. As above, Unity will fall back to Manual Activation, should the Online Activation fail. However, you can manually prompt Unity to start the Manual Activation procedure by navigating to 'Unity>Manage License' within the Editor.

2. In the 'License Management' window, hit the 'Manual activation' button.

3. You should now be faced with a dialog displaying three buttons:

a. 'Cancel' will take you back to the 'License Management' window.
b. 'Save License' will generate you a license file specific to your machine, based on your HWID. This file can be saved in any location on your physical machine.
c. 'Load License' will load the activation file generated by the Manual Activation process.


4. You will need to generate a license file; in order to do this, click the Save License button. Once clicked you will be faced with the window 'Save license information for offline activation'. Here you can select a directory on your machine to save the file.

5. Once saved, you will receive a message stating that 'License file saved successfully'. Click 'Ok' to proceed.

6. Now you'll need to minimise the Editor and navigate to https://license.unity3d.com/manual in your browser (if you are on a machine without an internet connection, you'll need to copy the file to a machine that has one and proceed there).

7. You now need to navigate to the file you generated in Step 4 and upload it in the appropriate field. When your file has been selected, click 'OK' to proceed.

8. Nearly done! You should have received a file in return; as in Step 4, save this to your machine in a directory of your choice.

9. Moving back into Unity, you can now select the 'Load License' button. This will open a file browser; select the file that you just saved via the web form and click 'OK'.

10. Voila, you've just completed the Manual Activation process.


For any further assistance, please contact support@unity3d.com.

Page last updated: 2012-11-28



Game Code How-to

Page last updated: 2012-11-13



HOWTO-First Person Walkthrough

This page describes how to create a simple first-person walkthrough of your own work.

  1. Import your level. See here for how to import geometry from your art package into Unity.
  2. Select the imported model file and enable 'Generate Colliders' in the Import Settings in the Inspector.
  3. Locate the Standard Assets->Prefabs->First Person Controller in the Project View and drag it into the Scene View.
  4. Make sure the scale of your level is correct. The First Person Controller is exactly 2 meters tall, so if your level does not fit the size of the controller, you should adjust the scale of the level inside your modeling application. Getting the scale right is critical for physical simulation, and for other reasons documented at the bottom of this page. Using the wrong scale can make objects feel like they are floating or too heavy. If you can't change the scale in your modeling application, you can change it in the model file's Import Settings.
  5. Move the First Person Controller to the starting position using the Transform handles. It is critical that the First Person Controller does not intersect any level geometry when the game starts (otherwise it will get stuck!).
  6. Delete the default camera 'Main Camera' in the Hierarchy View. The First Person Controller already has its own camera.
  7. Hit Play and walk around your level.

Page last updated: 2012-11-09



Graphics how-tos

The following is a list of common graphics-related questions in Unity and how to accomplish them.

A great tutorial on textures for color, bump, specular and reflection mapping can be found here.

Page last updated: 2012-11-13



HOWTO-alphamaps

Unity uses straight alpha blending. Hence, you need to expand the color layers. The alpha channel in Unity is read from the first alpha channel in the Photoshop file.

Setting up

Before doing this, install these alpha utility Photoshop actions: AlphaUtility.atn.zip

After installing, your Action Palette will contain a folder called AlphaUtility.

Getting alpha right

Let's assume you have your alpha texture on a transparent layer inside Photoshop, something like this:

  1. Duplicate the layer.
  2. Select the lowest layer. This will be the source for the dilation of the background.
  3. Select Layer->Matting->Defringe and apply it with the default properties.
  4. Run the Dilate Many action a couple of times. This will expand the background into a new layer.
  5. Select all the dilation layers and merge them with Command-E.
  6. Create a solid color layer at the bottom of your image stack. This should match the general color of your document (in this case, greenish). Note: without this layer, Unity will take the alpha from the merged transparency of all layers.

Now we need to copy the transparency into the alpha layer:

  1. Set the selection to be the contents of your main layer by Command-clicking on it in the Layer Palette.
  2. Switch to the channels palette.
  3. Create a new channel from the transparency.

Save your PSD file - you are now ready to go.

Note

If your image contains transparency (after merging the layers), Unity will take the alpha from the merged transparency of all layers and ignore your alpha mask. The workaround is to create the solid color layer from Item 6 above, so the alpha is taken from your alpha mask instead.

Page last updated: 2012-11-13



HOWTO-Normalmap

Normal maps are grayscale images that you use as a height map on your objects in order to give the appearance of raised or recessed surfaces. Assuming you have a model that looks like this:


"The 3D Model"

"The Texture"

We want to make the light parts of the object appear raised.

  1. Draw a grayscale height map of your texture in Photoshop. White is high, black is low. Something like this:
  2. Save the image next to your main texture.
  3. In Unity, select the image, set it to the 24 bit RGB format and enable "Generate Normal Map" in the Import Settings in the Inspector.


  1. In the Material Inspector of your model, select "Bumped Diffuse" from the shader drop-down.
  2. Drag your texture from the Project window to the "Normalmap" texture slot.

The object now has a normal map applied.

Hints

Page last updated: 2012-11-09



HOWTO-UseDetailTexture

A Detail texture is a small, fine pattern which is faded in as you get close to a surface, for example wood grain, imperfections in stone, or fine details on a terrain. They are explicitly used with the Diffuse Detail shader.

Detail textures must tile in all directions. Color values from 0-127 make the object darker, 128 doesn't change anything, and lighter colors make the object lighter. It is very important that the image is centered around 128 - otherwise the object it is applied to will get lighter or darker as you approach it.

  1. Draw or find a grayscale image of the detail texture.

    "The Detail Texture"


    "The Levels"
  2. Save the image next to your main texture.
  3. In Unity, select the image and, under "Generate Mip Maps", enable "Fades Out" and set the sliders to something similar to the example in the Import Settings in the Inspector.
  4. The top slider determines how small the texture should be before it starts fading out, and the bottom slider determines how far away the detail texture fades out completely.

  5. In the Material Inspector on the right, select Diffuse Detail from the shader drop-down.
  6. Drag your texture from the Project View to the "Detail" texture slot.
  7. Set the "Tiling" values to a high value.

Page last updated: 2012-11-09



HOWTO-MakeCubemap

Cubemaps are used by the Reflective built-in shaders. To build one, you can either create six 2D textures and create a new Cubemap asset, or build the Cubemap from a single square texture. See the Cubemap Texture documentation page for more details.

Static and dynamic cubemap reflections can also be rendered from scripts. The code example on the Camera.RenderToCubemap page contains a simple wizard script for rendering cubemaps straight from the editor.
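As a minimal sketch of rendering a cubemap from script (assuming a Cubemap asset assigned in the Inspector; note that Camera.RenderToCubemap requires Unity Pro):

```csharp
using UnityEngine;

// Hypothetical helper: renders the scene into a Cubemap asset from
// the position of the GameObject this script is attached to.
public class CubemapRenderer : MonoBehaviour
{
    public Cubemap cubemap;   // assign a Cubemap asset in the Inspector

    void Start()
    {
        // Create a temporary camera at this object's position and
        // capture all six faces into the cubemap.
        GameObject go = new GameObject("CubemapCamera");
        go.AddComponent<Camera>();
        go.transform.position = transform.position;
        go.camera.RenderToCubemap(cubemap);
        Destroy(go);
    }
}
```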

Page last updated: 2012-11-09



HOWTO-UseSkybox

A Skybox is a 6-sided cube that is drawn behind all graphics in the game. Here are the steps to create one:

  1. Make 6 textures that correspond to each of the 6 sides of the skybox and put them into your project's Assets folder.
  2. For each texture, you need to change the wrap mode from "Repeat" to "Clamp". If you don't do this, the colors on the edges will not match up.
  3. Create a new Material by choosing Assets->Create->Material from the menu bar.
  4. Select the shader drop-down at the top of the Inspector and choose RenderFX->Skybox.
  5. Assign the 6 textures to each texture slot in the material. You can do this by dragging each texture from the Project View onto the corresponding slot.

To assign the skybox to the scene you are working on:

  1. Choose Edit->Render Settings from the menu bar.
  2. Drag the new skybox material to the "Skybox Material" slot in the Inspector.

Note that the Standard Assets package contains several ready-to-use skyboxes - this is the quickest way to get started!

Page last updated: 2012-11-09



HOWTO-MeshParticleEmitter

Mesh Particle Emitters are generally used when you need a high degree of control over where particles are emitted from.

For example, when you want to create a flaming sword:

  1. Drag a mesh into the scene.
  2. Remove the Mesh Renderer by right-clicking on the Mesh Renderer's Inspector title bar and choosing Remove Component.
  3. Choose Mesh Particle Emitter from the Component->Effects->Legacy Particles menu.
  4. Choose Particle Animator from the Component->Effects->Legacy Particles menu.
  5. Choose Particle Renderer from the Component->Effects->Legacy Particles menu.

You should now see particles being emitted from the mesh.

Play around with the values in the Mesh Particle Emitter.

In particular, enable Interpolate Triangles in the Mesh Particle Emitter's Inspector and set Min Normal Velocity and Max Normal Velocity to 1.

To customize the look of the emitted particles:

  1. Choose Assets->Create->Material from the menu bar.
  2. In the Material Inspector, select Particles->Additive from the shader drop-down.
  3. Drag a texture from the Project View onto the texture slot in the Material Inspector.
  4. In the Scene View, drag the material from the Project View onto the particle system.

You should now see textured particles being emitted from the mesh.

See also:
Page last updated: 2012-11-13



HOWTO-SplashScreen

Desktop

Here is how to make a splash screen or any other fullscreen image in Unity. This method works for multiple resolutions and aspect ratios.

  1. First you need a big texture. Ideally, the texture size should be a power of two; 1024x512, for example, will fit most screens.
  2. Make a new box using the GameObject->Create Other->Cube menu bar item.
  3. Scale it to be in 16:9 format by entering 16 and 9 as the first two values of the Scale.
  4. Drag the texture onto the cube and make the Camera point at it. Place the camera at such a distance that the cube fills the view at a 16:9 aspect ratio. Use the Aspect Ratio Selector in the Scene View menu bar to check the end result.

iOS

Android

Page last updated: 2012-11-09



HOWTO-LightCookie

Unity ships with a few Light Cookies in the Standard Assets. When the Standard Assets are imported into your project, they can be found in Standard Assets->Light Cookies. This page shows how to create your own.

A great way to add a lot of visual detail to your scenes is to use cookies - grayscale textures used to control the precise look of in-game lighting. This is fantastic for making moving clouds and giving an impression of dense foliage. The Light Component Reference page has more info on all of this, but the main thing is that for textures to be usable as cookies you must set up the following import settings in your project.

To create a light cookie for a spot light:

  1. Paint a cookie texture in Photoshop. The image should be grayscale. White pixels mean full lighting intensity, black pixels mean no lighting. The borders of the texture need to be completely black, otherwise the light will appear to leak outside of the spotlight.
  2. In the Texture Inspector, change the "Repeat" Wrap mode to "Clamp".

Select the texture and edit the following Import Settings in the Inspector:

  1. Enable "Border Mipmaps".
  2. Enable "Build Alpha From Grayscale" (this way you can make a grayscale cookie and Unity converts it to an alpha map automatically).
  3. Set the Texture Format to "Alpha 8 Bit".

Page last updated: 2012-11-09



HOWTO-FixZAxisIsUp

Some 3D art packages export their models so that the Z axis points up. Most of Unity's standard scripts assume that the Y axis represents up in your 3D world. It is usually easier to fix the rotation in Unity than to modify the scripts to make things fit.

A model with the Z axis pointing up

If at all possible, it is recommended that you fix the model in your 3D modeling application so that the Y axis points up before exporting.

If this is not possible, you can fix it in Unity by adding an extra parent transform:

  1. Create an empty GameObject using the GameObject->Create Empty menu.
  2. Position the new GameObject so that it is at the center of your mesh, or at whatever point you want your object to rotate around.
  3. Drag the mesh onto the empty GameObject.

You have now made your mesh a Child of an empty GameObject with the correct orientation. Whenever you write scripts that use the Y axis as up, attach them to the Parent empty GameObject.

The model with an extra empty transform

Page last updated: 2012-11-09



HOWTO-Water

ウォータをどのように使用しますか?

注意: 本ページの内容は、デスクトップ エディタ モードにのみ適用されます。

Unity には、Standard Assets and Pro Standard Assets packages 内に、いくつかのウォータ プレハブ (必要なシェーダ、スクリプトおよびアート アセットを含む) を含んでます。 Unity には、基本的なウォータを含んでいますが、Unity Pro は、リアルタイムの反射や屈折を持つウォータを含んでいます。いずれも、個々のデイライトおよびナイトタイム ウォータ プレハブとして提供されています。

「反射するデイライト ウォータ (Unity Pro)」

「反射/屈折するデイライト ウォータ (Unity Pro)」

ウォータ設定

大半の場合、既存のプレハブの 1 つをシーンに置く必要があります (make sure to have the Standard Assets をインストールすること)。

プレハブは、ウォータに楕円状のメッシュを使用します。 別の Mesh を使用したい場合は、メッシュをウォータ オブジェクトの「Mesh Filter」に変更するだけで簡単に使用できます。

Creating water from scratch (Advanced)

A simple water in Unity requires attaching a script to a plane-like mesh and using the water shader:

  1. Have a mesh for the water. This should be a flat mesh, oriented horizontally. UV coordinates are not required. The water GameObject should use the water "layer", which you can set in the Inspector.
  2. Attach the "WaterSimple" script (from Standard Assets/Water/Sources) to the object.
  3. Use the FX/Water (simple) shader in the material, or tweak one of the provided water materials (Daylight Simple Water or Nighttime Simple Water).

The reflective/refractive water in Unity Pro requires similar steps to set up from scratch:

  1. Have a mesh for the water. This should be a flat mesh, oriented horizontally. UV coordinates are not required. The water GameObject should use the water "layer", which you can set in the Inspector.
  2. Attach the "Water" script (from Pro Standard Assets/Water/Sources) to the object.
    • The water rendering mode can be set in the Inspector: Simple, Reflective or Refractive.
  3. Use the FX/Water shader in the material, or tweak one of the provided water materials (Daylight Water or Nighttime Water).

Properties of water materials

These properties are used in the reflective and refractive water shaders. Most of them are also used by the simple water shader.

Wave scale - Scaling of the water's normal map. The smaller the value, the larger the water waves.
Reflection/refraction distort - How much the reflection and refraction are distorted by the waves' normal map.
Refraction color - Additional tint for the refraction.
Environment reflection/refraction - Render textures for real-time reflection and refraction.
Normalmap - Defines the shape of the waves. The final waves are produced by combining these two normal maps, each scrolling at a different direction, scale and speed. The second normal map is half as large as the first one.
Wave speed - Scrolling speed of the first normal map (1st and 2nd numbers) and the second normal map (3rd and 4th numbers).
Fresnel - A texture with an alpha channel that controls the Fresnel effect: how much reflection versus refraction is visible, depending on the viewing angle.

The remaining properties are not used by the reflective and refractive shaders themselves, but need to be set up in case the user's video card does not support them and the water falls back to a simpler shader:

Reflective color/cube and fresnel - A texture that defines the water color (RGB) and the Fresnel effect (A) based on the viewing angle.
Horizon color - The color of the water at the horizon. (Used only by the simple water shader.)
Fallback texture - A texture used to represent the water on very old video cards, if none of the better-looking shaders can run.

Hardware support

Page last updated: 2012-11-09



HOWTO-exportFBX

Unity supports FBX files, which can be generated from many popular 3D applications. Use these guidelines to help ensure the best results.

Select > Prepare > Check Settings > Export > Verify > Import

What do you want to export? - be aware of export scope, e.g. meshes, cameras, lights, animation rigs, etc.

What do you need to include? - prepare your assets:

How do I include those elements? - check the FBX export settings

Which version of FBX are you using? If in doubt, use 2012.2.

Will it work? - Verify your export

Import!

See below for a Maya FBX dialog example:

Fig 1 General, Geometry & Animation

Fig 2 Lights, Advanced options

Page last updated: 2012-11-26



HOWTO-ArtAssetBestPracticeGuide

Unity supports textured 3D models from a variety of programs or sources. This short guide has been put together by games artists and developers at Unity, to help you create assets that work better and more efficiently in your Unity project.

Scale & Units

Files & Objects

Sensibly named objects help you find stuff quickly

Mesh

Stairway to framerate heaven

The method you use to construct objects can have a massive effect on the number of polygons, especially when not optimised. Observe the same shape mesh: 156 triangles (right) vs 726 (left). 726 may not sound like a great deal of polygons, but if this is used 40 times in a level, you will really start to see the savings. A good rule of thumb is often to start simple and add detail where needed. It's always easier to add polygons than to take them away.

Textures

Textures are more efficient and don't need rescaling at build time if authored to specific texture sizes, e.g. a power of two up to 4096x4096 pixels, such as 512x512 or 256x1024 (2048x2048 is the highest supported on many graphics cards/platforms). There is lots of expertise online for creating good textures, but some of these guidelines can help you get the most efficient results from your project:
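The power-of-two rule above is easy to check programmatically. A minimal sketch in plain JavaScript (illustrative only; these helpers are not part of any Unity API):

```javascript
// Returns true when a texture dimension is a power of two (1, 2, 4, ..., 4096).
function isPowerOfTwo(n) {
    return n > 0 && (n & (n - 1)) === 0;
}

// A texture avoids build-time rescaling when both sides are powers of two.
function needsRescaling(width, height) {
    return !(isPowerOfTwo(width) && isPowerOfTwo(height));
}
```

For example, needsRescaling(512, 512) is false, while needsRescaling(800, 600) is true, so an 800x600 source image would be resized at build time.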

1 texture (left) vs 3 textures (right)

Tiling textures ftw

Do you need ALL those windows?

Materials

Import/Export

Unity can use two types of files: Saved 3D application files and Exported 3D formats. Which you decide to use can be quite important:

Saved application files

Unity can import, through conversion: Max, Maya, Blender, Cinema4D, Modo, Lightwave & Cheetah3D files, e.g. .MAX, .MB, .MA etc. See more in Importing Objects.

Advantages:

Disadvantages:

Exported 3D formats

Unity can also read FBX, OBJ, 3DS, DAE & DXF files. For a general export guide you can refer to this section.

Advantages:

Disadvantages:

Page last updated: 2012-11-26



HOWTO-importObject

Unity supports importing from the most popular 3D applications. Choose the one you are working with below.

Other applications

Unity can read .FBX, .dae, .3DS, .dxf and .obj files, so check whether your program can export to one of these formats. FBX exporters for popular 3D packages can be found here. Many packages also have a Collada exporter available.

Hints

See also:

Page last updated: 2012-11-09



HOWTO-ImportObjectMaya

Unity natively imports Maya files. To get started, simply place your .mb or .ma file in your project's Assets folder. When you switch back into Unity, the scene is imported automatically and will show up in the Project View.

To see your model in Unity, simply drag it from the Project View into the Scene View or Hierarchy View.

Unity currently imports from Maya:

  1. All nodes with position, rotation and scale. Pivot points and names are also imported.
  2. Meshes with vertex colors, normals and up to 2 UV sets.
  3. Materials with texture and diffuse color. Multiple materials per mesh.
  4. Animations FK & IK.
  5. Bone-based animations.

Unity does not import blend shapes. Use bone-based animations instead. Unity automatically triangulates polygonal meshes when importing, so you don't need to do this manually in Maya.

If you are using IK to animate characters, select the imported .mb file in the Project View and choose "Bake IK & Simulation" in the Import Settings dialog in the Inspector.

Requirements

In order to import Maya .mb and .ma files, you need to have Maya installed on the machine you are using Unity on. Maya 8.0 and later is supported.

If you don't have Maya installed on your machine but want to import a Maya file from another machine, you can export to the FBX format, which Unity imports natively. For best results, install FBX 2011.3. See HOWTO_exportFBX for how to export.
Once exported, place the fbx file in the Unity project folder. Unity will then automatically import the fbx file. Check the FBX import settings in the Inspector, as mentioned in HOWTO_exportFBX.

Behind the import process (Advanced)

When Unity imports a Maya file, it launches Maya in the background. Unity then communicates with Maya to convert the .mb file into a format Unity can read. The first time you import a Maya file into Unity, Maya has to launch as a command-line process; this can take around 20 seconds, but subsequent imports will be very quick.

Troubleshooting

Page last updated: 2012-11-09



HOWTO-ImportObjectCinema4D

Unity natively imports Cinema 4D files. To get started, simply place your .c4d file in your project's Assets folder. When you switch back into Unity, the scene is imported automatically and will show up in the Project View.

To see your model in Unity, simply drag it from the Project View into the Scene View.

If you modify your .c4d file, Unity will automatically update whenever you save.

Unity currently imports:

  1. All objects with position, rotation and scale. Pivot points and names are also imported.
  2. Meshes with UVs and normals.
  3. Materials with texture and diffuse color. Multiple materials per mesh.
  4. Animations FK (IK needs to be baked manually).
  5. Bone-based animations.

Unity does not currently import Point Level Animations (PLA). Use bone-based animations instead.

Animating characters using IK

If you are using IK to animate your characters in Cinema 4D, you have to bake the IK before exporting, using the Plugins->Mocca->Cappucino menu. If you don't bake your IK prior to importing into Unity, you will most likely only get animated locators but no animated bones.

Requirements

If you don't have Cinema 4D installed on your machine but want to import a Cinema 4D file from another machine, you can export to the FBX format, which Unity imports natively:

  1. Open the Cinema 4D file.
  2. In Cinema 4D choose File->Export->FBX 6.0.
  3. Place the exported fbx file in your Unity project's Assets folder. Unity will then automatically import the fbx file.

Hints

  1. To maximize import speed when importing Cinema 4D files, go to the Cinema 4D preferences (Edit->Preferences) and select the FBX 6.0 preferences. Uncheck "Embed Textures".

Behind the import process (Advanced)

When Unity imports a Cinema 4D file, it installs a Cinema 4D plugin and launches Cinema 4D in the background. Unity then communicates with Cinema 4D to convert the .c4d file into a format Unity can read. The first time you import a .c4d file while Cinema 4D is not open yet, it takes a short while to launch it, but afterwards .c4d files will import very quickly.

Cinema 4D 10 support

When importing .c4d files directly, Unity behind the scenes lets Cinema 4D convert the files to FBX. When Cinema 4D 10.0 shipped, its FBX exporter was severely broken. Cinema 4D 10.1 fixed many of these issues. We therefore strongly recommend upgrading Cinema 4D 10 to 10.1.

Some issues remain in Maxon's FBX exporter. Currently there seems to be no reliable way of exporting animated characters that use the joints introduced in Cinema 4D 10. However, the old bone system available in 9.6 exports perfectly. Thus, when creating animated characters, it is critical to use the old bone system instead of joints.

Page last updated: 2012-11-09



HOWTO-ImportObjectMax

If you make your 3D objects in 3ds Max, you can save your .max files directly into your Project, or export them into Unity using the Autodesk .FBX or another generic format.

Unity currently imports from 3ds Max (saving a Max file or exporting a generic 3D file type each has advantages and disadvantages, see Mesh):

  1. All nodes with position, rotation and scale. Pivot points and names are also imported.
  2. Meshes with vertex colors, normals and two UV sets.
  3. Materials with diffuse texture and color. Multiple materials per mesh.
  4. Animations.
  5. Bone-based animations.

To export to FBX from 3ds Max:

  1. Download the latest fbx exporter from the Autodesk website and install it.
  2. Export your scene or selected objects (File->Export or File->Export Selected) in .fbx format. Using the default export options should be okay.
  3. Copy the exported fbx file into your Unity project folder.
  4. When you switch back into Unity, the .fbx file is imported automatically.
  5. Drag the file from the Project View into the Scene View.

Exporter options

Using the default FBX exporter options (which basically export everything) should be okay.

Embed textures - this stores the image maps in the file, good for portability, not so good for file size.

Default FBX exporter options (for fbx plugin version 2013.3)

Exporting bone-based animations

There is a procedure you should follow when you want to export bone-based animations:

  1. Set up the bone structure as you please.
  2. Create the animations you want, using FK and/or IK.
  3. Select all bones and/or IK solvers.
  4. Go to Motion->Trajectories and press Collapse. Unity makes a key filter, so the amount of keys you export is irrelevant.
  5. "Export" or "Export selected" as the newest FBX format.
  6. Drop the FBX file into Assets as usual.
  7. In Unity you must reassign the Texture to the Material of the root bone.

When exporting a bone hierarchy with mesh and animations from 3ds Max to Unity, the GameObject hierarchy produced will correspond to the hierarchy you can see in the "Schematic view" in 3ds Max. One difference is that Unity will place a GameObject as the new root, containing the animations, and will place the mesh and material information in the root bone.

If you prefer to have the animation and mesh information in the same Unity GameObject, go to the Hierarchy view in 3ds Max and parent the mesh node to a bone in the bone hierarchy.

Exporting two UV sets for lightmapping

3ds Max's Render To Texture and automatic unwrapping functionality can be used to create lightmaps. Note that Unity has a built-in lightmapper, but you might prefer using 3ds Max if that fits your workflow better. Usually one UV set is used for the main texture and/or normal maps, and another UV set is used for the lightmap texture. For both UV sets to come through properly, the material in 3ds Max has to be Standard, and both the Diffuse (for the main texture) and Self-Illumination (for the lightmap) map slots have to be set up.

Material setup for lightmapping in 3ds Max, using a self-illumination map

Note: If the object uses the Shell material type, the current Autodesk FBX exporter will not export the UVs correctly.

Alternatively, you can use the Multi/Sub Object material type and set up two sub-materials, using the main texture and the lightmap in their diffuse map slots, as shown below. However, if the faces of your model use different sub-material IDs, this will result in multiple materials being imported, which is not optimal for performance.

Alternative material setup for lightmapping in 3ds Max, using a Multi/Sub Object material

Troubleshooting

If you have trouble importing some models, make sure that you have the latest FBX plugin installed (it can be downloaded from the Autodesk website); if that does not help, try reverting to FBX 2012.

Page last updated: 2012-11-09



HOWTO-ImportObjectCheetah3D

Unity natively imports Cheetah3D files. To get started, simply place your .jas file in your project's Assets folder. When you switch back into Unity, the scene is imported automatically and will show up in the Project View.

To see your model in Unity, simply drag it from the Project View into the Scene View.

If you modify your .jas file, Unity will automatically update whenever you save.

Unity currently imports from Cheetah3D:

  1. All nodes with position, rotation and scale. Pivot points and names are also imported.
  2. Meshes with vertices, polygons, triangles, UVs and normals.
  3. Animations.
  4. Materials with diffuse color and textures.

Requirements

Page last updated: 2012-11-09



HOWTO-ImportObjectModo

Unity natively imports modo files. This works under the hood by using the modo COLLADA exporter; modo version 501 and later use this approach. To get started, save your .lxo file in the project's Assets folder. When you switch back into Unity, the file is imported automatically and will show up in the Project View.

For older modo versions prior to 501, save your modo scene as an FBX or COLLADA file into the Unity project folder. When you switch back into Unity, the scene is imported automatically and will show up in the Project View.

To see your model in Unity, drag it from the Project View into the Scene View.

If you modify the .lxo file, Unity will automatically update whenever you save.

Unity currently imports:

  1. All nodes with position, rotation and scale. Pivot points and names are also imported.
  2. Meshes with vertices, normals and UVs.
  3. Materials with texture and diffuse color. Multiple materials per mesh.
  4. Animations.

Requirements

Page last updated: 2012-11-09



HOWTO-importObjectLightwave

You can import meshes and animations from Lightwave using the FBX plugin for Lightwave.

Unity currently imports:

  1. All nodes with position, rotation and scale. Pivot points and names are also imported.
  2. Meshes with UVs and normals.
  3. Materials with texture and diffuse color. Multiple materials per mesh.
  4. Animations.
  5. Bone-based animations.

Installation

You can download the latest Lightwave FBX exporter from:

By downloading these plugins you automatically agree to this licence.

There are two versions of the plugin: one for LightWave 8.0 and one for LightWave 8.2 through 9.0. Make sure you install the correct version.

The plugin for Mac comes in an OS X package. If you double-click the package to install it, the installer will place it in the correct folder. If it can't find your LightWave plugin folder, it creates its own LightWave folder in your Applications folder and dumps the plugin there. If the latter occurs, you will have to move the plugin to the LightWave plugin folder (or a sub-folder) yourself. You will then have to add the plugin to LightWave via the "Edit Plugins" panel (Option-F11); consult the LightWave manual for details on how to add plugins.

Once added to LightWave, the plugin is accessed via the Generics menu tab (under Utilities). If you don't see a Generics menu, you will have to add it using the Configure Menus panel. In that panel it can be found within the Plugins category, where the plugin is listed as a generic plugin. Add it to a convenient menu (consult the LightWave manual for details on how to do this).

Further information on installation can be found in the release notes, which can be downloaded along with the installers.

Exporting

All objects and animations have to be exported from Layout (there is no Modeler FBX exporter).

1. Select Export to FBX from the Generics menu

2. Select the appropriate settings in the fbx export dialog

3. Switch to Unity.

Notes

Page last updated: 2012-11-09



HOWTO-ImportObjectBlender

Unity natively imports Blender files. This works under the hood by using the Blender FBX exporter, which was added to Blender in version 2.45. For this reason, you need to update to Blender 2.45 or later (but see Requirements below).

To get started, save your .blend file in your project's Assets folder. When you switch back into Unity, the file is imported automatically and will show up in the Project View.

To see your model in Unity, drag it from the Project View into the Scene View.

If you modify your .blend file, Unity will automatically update whenever you save.

Unity currently imports:

  1. All nodes with position, rotation and scale. Pivot points and names are also imported.
  2. Meshes with vertices, polygons, triangles, UVs and normals.
  3. Bones.
  4. Skinned meshes.
  5. Animations.

Requirements

Page last updated: 2012-11-09



Workflow

Page last updated: 2012-11-09



HOWTO-MonoDevelop

MonoDevelop is shipped with Unity 3.x; this IDE will help you with the scripting part of your game and with debugging it.

Setting up MonoDevelop

To set up MonoDevelop to work with Unity, go to the Unity preferences and set it as the default editor.


Setting MonoDevelop as the default editor

After this, create or open an existing project and click Assets -> Sync MonoDevelop Project to synchronize the project with MonoDevelop.

Syncing MonoDevelop

This will open your project in MonoDevelop (scripting files only, no assets). Now you are ready to start debugging.

If you have any problems setting up your project, try visiting the troubleshooting page as well.

Page last updated: 2012-11-09



HOWTO-exportpackage

As you build your game, Unity stores a lot of metadata about your assets (import settings, links to other assets, etc.). If you want to take your assets into a different project, there is a specific way to do that. Here is how you can easily move assets between projects while preserving all this information:

  1. In the Project View, select all the asset files you want to export.
  2. Choose Assets->Export Package... from the menu bar.
  3. Name the package and save it anywhere you like.
  4. Open the project you want to bring the assets into.
  5. Choose Assets->Import Package... from the menu bar.
  6. Select the package file you saved in step 3.

Hints

Page last updated: 2012-11-09



HOWTO-InstallStandardAssets

Unity ships with multiple Standard Assets packages. These are collections of assets that are widely used by most Unity users. When you create a new project from the Project Wizard, you can selectively include these asset collections. The assets are copied from the Unity install folder into your new project. This means that if you upgrade Unity to a new version, you will not automatically get new versions of these assets, so you will have to upgrade them yourself. Also note that a new version of an effect may behave differently for performance or quality reasons, so you may need to re-tweak its parameters. This is important to consider before upgrading if you don't want your game to suddenly look or behave differently. Check the package contents and the Unity release notes.

Standard Assets contain useful things such as a first person controller, skyboxes, lens flares, Water prefabs and Image Effects.


Standard Assets packages listed when creating a new project

Upgrading

Sometimes you may want to upgrade your Standard Assets, for example because a new version of Unity ships with new Standard Assets:

  1. Select the package you want to update from the Assets->Import Package submenu.
  2. A list of new or replaced assets will be presented; click Import.

For the cleanest possible upgrade, consider removing the old package contents first, as some scripts, effects or prefabs might have become deprecated or unneeded, and Unity packages have no way of deleting (unneeded) files (but do keep a safe copy of the old version in case you still need it).

Page last updated: 2012-11-09



HOWTO-PortingBetweenPlatforms

Most of Unity's API and project structure is identical for all supported platforms and in some cases a project can simply be rebuilt to run on different devices. However, fundamental differences in the hardware and deployment methods mean that some parts of a project may not port between platforms without change. Below are details of some common cross-platform issues and suggestions for solving them.

Input

The most obvious example of different behaviour between platforms is in the input methods offered by the hardware.

Keyboard and joypad

The Input.GetAxis function is very convenient on desktop platforms as a way of consolidating keyboard and joypad input. However, this function doesn't make sense for the mobile platforms which rely on touchscreen input. Likewise, the standard desktop keyboard input doesn't port over to mobiles well for anything other than typed text. It is worthwhile to add a layer of abstraction to your input code if you are considering porting to other platforms in the future. As a simple example, if you were making a driving game then you might create your own input class and wrap the Unity API calls in your own functions:

// Returns values in the range -1.0 .. +1.0 (== left .. right).
function Steering() {
	return Input.GetAxis("Horizontal");
}


// Returns values in the range -1.0 .. +1.0 (== accel .. brake).
function Acceleration() {
	return Input.GetAxis("Vertical");
}


var currentGear: int;

// Returns an integer corresponding to the selected gear.
function Gears() {
	if (Input.GetKeyDown("p"))
		currentGear++;
	else if (Input.GetKeyDown("l"))
		currentGear--;

	return currentGear;
}

One advantage of wrapping the API calls in a class like this is that they are all concentrated in a single source file and are consequently easy to locate and replace. However, the more important idea is that you should design your input functions according to the logical meaning of the inputs in your game. This will help to isolate the rest of the game code from the specific method of input used with a particular platform. For example, the Gears function above could be modified so that the actual input comes from touches on the screen of a mobile device. Using an integer to represent the chosen gear works fine for all platforms, but mixing the platform-specific API calls with the rest of the code would cause problems. You may find it convenient to use platform dependent compilation to combine the different implementations of the input functions in the same source file and avoid manual swaps.

Touches and clicks

The Input.GetMouseButtonXXX functions are designed so that they have a reasonably obvious interpretation on mobile devices even though there is no "mouse" as such. A single touch on the screen is reported as a left click and the Input.mousePosition property gives the position of the touch as long as the finger is touching the screen. This means that games with simple mouse interaction can often work transparently between the desktop and mobile platforms. Naturally, though, the conversion is often much less straightforward than this. A desktop game can make use of more than one mouse button and a mobile game can detect multiple touches on the screen at a time.

As with API calls, the problem can be managed partly by representing input with logical values that are then used by the rest of the game code. For example, a pinch gesture to zoom on a mobile device might be replaced by a plus/minus keystroke on the desktop; the input function could simply return a float value specifying the zoom factor. Likewise, it might be possible to use a two-finger tap on a mobile to replace a right button click on the desktop. However, if the properties of the input device are an integral part of the game then it may not be possible to remodel them on a different platform. This may mean that the game cannot be ported at all or that the input and/or gameplay need to be modified extensively.
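To make the "logical value" idea concrete, both a pinch gesture and plus/minus keystrokes can be reduced to a single zoom factor before the rest of the game code ever sees them. A sketch in plain JavaScript; the function names are illustrative assumptions, not Unity API:

```javascript
// Mobile side: a pinch gesture becomes a zoom factor - the ratio between
// the current and the initial distance of the two touch points.
function pinchZoomFactor(startDistance, currentDistance) {
    if (startDistance <= 0) return 1.0; // guard against a degenerate gesture
    return currentDistance / startDistance;
}

// Desktop side: each plus/minus keypress scales the current factor by a step.
function keyZoomFactor(current, plusPressed, minusPressed, step) {
    if (plusPressed) return current * step;
    if (minusPressed) return current / step;
    return current;
}
```

The rest of the game only ever consumes the returned factor, so swapping the input source never touches the camera code.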

Accelerometer, compass, gyroscope and GPS

These inputs derive from the mobility of handheld devices and so may not have any meaningful equivalent on the desktop. However, some use cases simply mirror standard game controls and can be ported quite easily. For example, a driving game might implement the steering control from the tilt of a mobile device (determined by the accelerometer). In cases like this, the input API calls are usually fairly easy to replace, so the accelerometer input might be replaced by keystrokes, say. However, it may be necessary to recalibrate inputs or even vary the difficulty of the game to take account of the different input method. Tilting a device is slower and eventually more strenuous than pressing keys and may also make it harder to concentrate on the display. This may result in the game's being more difficult to master on a mobile device and so it may be appropriate to slow down gameplay or allow more time per level. This will require the game code to be designed so that these factors can be adjusted easily.
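The tilt-based steering described above can be expressed as a pure mapping from tilt angle to the -1..+1 steering range, which also makes recalibration a matter of changing two constants. A sketch in plain JavaScript; the dead zone and maximum tilt values are illustrative assumptions that a real game would tune per device:

```javascript
// Maps a device tilt (in degrees) to a steering value in -1..+1.
// deadZone: tilts smaller than this are ignored (hands are never perfectly still).
// maxTilt: the tilt at which steering reaches full lock.
function tiltToSteering(tiltDegrees, deadZone, maxTilt) {
    if (Math.abs(tiltDegrees) < deadZone) return 0;
    var sign = tiltDegrees < 0 ? -1 : 1;
    // Remove the dead zone, then normalize against the remaining range.
    var scaled = (Math.abs(tiltDegrees) - deadZone) / (maxTilt - deadZone);
    return sign * Math.min(scaled, 1);
}
```

On desktop, the same Steering() abstraction would simply map keystrokes to -1, 0 or +1, leaving the rest of the game unchanged.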

Memory, storage and CPU performance

Mobile devices inevitably have less storage, memory and CPU power available than desktop machines and so a game may be difficult to port simply because its performance is not acceptable on lower powered hardware. Some resource issues can be managed but if you are pushing the limits of the hardware on the desktop then the game is probably not a good candidate for porting to a mobile platform.

Movie playback

Currently, mobile devices are highly reliant on hardware support for movie playback. The result is that playback options are limited and certainly don't give the flexibility that the MovieTexture asset offers on desktop platforms. Movies can be played back fullscreen on mobiles but there isn't any scope for using them to texture objects within the game (so it isn't possible to display a movie on a TV screen within the game, for example). In terms of portability, it is fine to use movies for introductions, cutscenes, instructions and other simple pieces of presentation. However, if movies need to be visible within the game world then you should consider whether the mobile playback options will be adequate.

Storage requirements

Video, audio and even textures can use a lot of storage space and you may need to bear this in mind if you want to port your game. Storage space (which often also corresponds to download time) is typically not an issue on desktop machines but this is not the case with mobiles. Furthermore, mobile app stores often impose a limit on the maximum size of a submitted product. It may require some planning to address these concerns during the development of your game. For example, you may need to provide cut-down versions of assets for mobiles in order to save space. Another possibility is that the game may need to be designed so that large assets can be downloaded on demand rather than being part of the initial download of the application.

Automatic memory management

The recovery of unused memory from "dead" objects is handled automatically by Unity and often happens imperceptibly on desktop machines. However, the lower memory and CPU power on mobile devices means that garbage collections can be more frequent and the time they take can impinge more heavily on performance (causing unwanted pauses in gameplay, etc). Even if the game runs in the available memory, it may still be necessary to optimise code to avoid garbage collection pauses. More information can be found on our memory management page.
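A common way to keep collections infrequent is to reuse objects instead of allocating fresh ones every frame. A minimal object pool sketch (plain JavaScript to keep it self-contained; the same pattern applies to C# scripts):

```javascript
// A tiny object pool: obtain() reuses a released object when one is
// available, so steady-state gameplay allocates nothing new.
function Pool(factory) {
    this.factory = factory; // creates a new object when the pool is empty
    this.free = [];
}
Pool.prototype.obtain = function () {
    return this.free.length > 0 ? this.free.pop() : this.factory();
};
Pool.prototype.release = function (obj) {
    this.free.push(obj); // hand the object back instead of dropping it
};
```

Bullets, particles and similar short-lived objects are typical candidates: once the pool has warmed up, the garbage collector sees no new "dead" objects at all.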

CPU power

A game that runs well on a desktop machine may suffer from poor framerate on a mobile device simply because the mobile CPU struggles with the game's complexity. Extra attention may therefore need to be paid to code efficiency when a project is ported to a mobile platform. A number of simple steps to improve efficiency are outlined on this page in our manual.

Page last updated: 2012-05-31



MobileDeveloperChecklist

If you are having problems when developing for a mobile platform, this is a checklist to help you solve various problems.

Page last updated: 2012-10-10



MobileCrashes

Checklist for crashes

Editor.log - on the editor

The debug messages, warnings and errors all go to the console. Unity also prints status reports to the console: loading assets, initializing Mono, graphics driver info.

If you are trying to understand what is going on, look at the Editor.log. Here you will get the full picture, not just a console fragment, and you can watch the full log of your coding session. This will help you track down what has caused Unity to crash, or find out what's wrong with your assets.

Unity prints some things on the devices as well: the logcat console for Android and the Xcode gdb console on iOS devices.

Android

Debugging on Android

  1. Use the DDMS or ADB tool
  2. Watch the stacktrace (Android 3 or newer). Either use c++filt (part of the NDK) or other methods, like http://slush.warosu.org/c++filtjs, to decode the mangled function calls
  3. Look at the .so file that the crash occurs on:
    1. libunity.so - the crash is in the Unity code or the user code
    2. libdvm.so - the crash is in the Java world, somewhere within Dalvik. So find Dalvik's stacktrace, and look at your JNI code or anything Java-related (including your possible changes to the AndroidManifest.xml).
    3. libmono.so - either a Mono bug, or you're doing something Mono strongly dislikes
  4. If the crash log does not help, you can disassemble the library to get a rough understanding of what has happened:
    1. use ARM EABI tools from the Android NDK like this: objdump.exe -S libmono.so >> out.txt
    2. Look at the code around the pc address from the stacktrace.
    3. Try to match that code within the fresh out.txt file.
    4. Scroll up to understand what is happening in the function it occurs in.

iOS

Debugging on iOS

  1. Xcode has built in tools. Xcode 4 has a really nice GUI for debugging crashes, Xcode 3 has less.
  2. Full gdb stack - thread apply all bt
  3. Enable soft-null-check:

Enable development build and script debugging. Now uncaught null ref exceptions will be printed to the Xcode console with the appropriate managed call stack.

  4. Try turning the "fast script call" and code stripping off. It may stop some random crashes, like those caused by using some rare .NET functions or reflection.

Strategy

  1. Try to figure out which script the crash happens in and debug it using MonoDevelop on the device.
  2. If the crash seems to not be in your code, take a closer look at the stacktrace; there should be a hint of something happening. Take a copy and submit it, and we'll take a look.

Page last updated: 2012-10-10



MobileProfiling

Ports that the Unity profiler uses:

	MulticastPort : 54998
	ListenPorts : 55000 - 55511
	Multicast(unittests) : 55512 - 56023

They should be accessible from within the network node. That is, the devices that you're trying to profile on should be able to see these ports on the machine running the Unity Editor with the Profiler on.

First steps

Unity relies on the CPU (heavily optimized for the SIMD part of it, like SSE on x86 or NEON on ARM) for skinning, batching, physics, user scripts, particles, etc.

The GPU is used for shaders, drawcalls, image effects.

CPU or GPU bound

Pareto analysis

A large majority of problems (80%) are produced by a few key causes (20%).

  1. Use the Editor profiler to get the most problematic function calls and optimize them first.
  2. Make sure the scripts run only when necessary.
    1. Use OnBecameVisible/OnBecameInvisible to disable inactive objects.
    2. Use coroutines if you don't need some scripts to run every frame.
// Do some stuff every frame:
void Update () {
}

// Do some stuff every 0.2 seconds:
IEnumerator Start () {
   while (true) {
      yield return new WaitForSeconds (0.2f);
   }
}
  3. Use the .NET System.Threading.Thread class to move heavy calculations to another thread. This allows you to run on multiple cores, but the Unity API is not thread-safe - so buffer inputs and results, and read and assign them on the main thread.

CPU Profiling

Profile user code

Not all of the user code is shown in the Profiler, but you can use Profiler.BeginSample and Profiler.EndSample to make the required user code appear in the profiler.
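Conceptually, BeginSample/EndSample behaves like a stack of named timers whose elapsed times are accumulated per label. A rough model of that pairing in plain JavaScript (illustrative only; this is not how the Unity profiler is implemented):

```javascript
// Accumulated milliseconds per sample label.
var samples = {};
// Currently open samples; endSample() closes the most recent beginSample().
var stack = [];

function beginSample(label) {
    stack.push({ label: label, start: Date.now() });
}

function endSample() {
    var open = stack.pop();
    var elapsed = Date.now() - open.start;
    samples[open.label] = (samples[open.label] || 0) + elapsed;
}
```

The important property is that every BeginSample must be matched by exactly one EndSample, and nested samples attribute their time to the innermost open label.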

GPU Profiling

The Unity Editor profiler cannot show GPU data as of now. We're working with hardware manufacturers to make it happen, with the Tegra devices being the first to appear in the Editor profiler.

iOS

Tools for iOS

  • Unity internal profiler (not the Editor profiler). This shows the GPU time for the whole scene.
  • PowerVR PVRUniSCo shader analyzer. See below.
  • iOS: Xcode OpenGL ES Driver Instruments can show only high-level info:
    • Device Utilization % - GPU time spent on rendering in total. >95% means the app is GPU bound.
    • Renderer Utilization % - GPU time spent drawing pixels.
    • Tiler Utilization % - GPU time spent processing vertices.
    • Split count - the number of frame splits, where the vertex data didn't fit into the allocated buffers.

PowerVR is a tile-based deferred renderer, so it's impossible to get GPU timings per draw call. However, you can get GPU times for the whole scene using Unity's built-in profiler (the one that prints results to the Xcode output). Apple's tools currently can only tell you how busy the GPU and its parts are, but do not give times in milliseconds.

PVRUniSCo gives cycles for the whole shader, and approximate cycles for each line in the shader code. Windows & Mac! But it won't match exactly what Apple's drivers are doing anyway. Still, it's a good ballpark measure.

Android

Tools for Android

  • Adreno (Qualcomm)
  • NVPerfHUD (NVIDIA)
  • PVRTune, PVRUniSCo (PowerVR)

On Tegra, NVIDIA provides excellent performance tools which do everything you want - GPU time per draw call, cycles per shader, force 2x2 texture, null view rectangle - and run on Windows, OSX and Linux. PerfHUD ES does not easily work with consumer devices; you need the development board from NVIDIA.

Qualcomm provides the excellent Adreno Profiler, which is Windows only but works with consumer devices! It features timeline graphs, frame capture, frame debug, API calls, a shader analyzer and live editing.

Graphics related CPU profiling

The internal profiler gives a good overview per module:

Memory


There is Unity memory and mono memory.

Mono memory

Mono memory handles script objects, wrappers for Unity objects (game objects, assets, components, etc). Garbage Collector cleans up when the allocation does not fit in the available memory or on a System.GC.Collect() call.

Memory is allocated in heap blocks. More can be allocated if the data does not fit into the existing blocks. Heap blocks are kept by Mono until the app is closed. In other words, Mono does not release any used memory back to the OS (Unity 3.x). Once you allocate a certain amount of memory, it is reserved for Mono and not available for the OS. Even when you release it, it becomes available internally for Mono only, not for the OS. The heap memory value in the Profiler will only increase, never decrease.
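A toy model of this behaviour (illustrative JavaScript, not real Mono internals; the block size is arbitrary): the reserved heap only ever grows, and freeing an allocation gives the memory back to Mono, never to the OS.

```javascript
// Mono-style heap sketch: grows in fixed-size blocks, never shrinks.
function ToyHeap(blockSize) {
    this.blockSize = blockSize;
    this.reserved = 0; // total claimed from the OS - only ever increases
    this.used = 0;     // live allocations - reusable after free()
}
ToyHeap.prototype.alloc = function (size) {
    while (this.used + size > this.reserved) {
        this.reserved += this.blockSize; // claim another heap block
    }
    this.used += size;
};
ToyHeap.prototype.free = function (size) {
    this.used -= size; // reserved stays put: nothing returns to the OS
};
```

After allocating and freeing, "reserved" stays at its high-water mark while "used" drops, which mirrors the Profiler's ever-growing heap value.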

If the system cannot fit new data into the allocated heap block, Mono triggers a "GC" and can allocate a new heap block (for example, due to fragmentation).

Too many heap sections means you've run out of Mono memory (because of fragmentation or heavy usage).

Use System.GC.GetTotalMemory to get the total used Mono memory.

The general advice is, use as small an allocation as possible.

Unity memory

Unity memory handles Asset data (Textures, Meshes, Audio, Animation, etc), Game objects, Engine internals (Rendering, Particles, Physics, etc). Use Profiler.usedHeapSize to get the total used Unity memory.

Memory map

No tools yet but you can use the following.

Memory hiccups

class MyClass {
   public int a, b, c;
}

struct MyStruct {
   public int a, b, c;
}

void Update () {
   //BAD
   // allocated on the heap, will be garbage collected later!
   MyClass c = new MyClass();

   //GOOD
   //allocated on the stack, no GC going to happen!
   MyStruct s = new MyStruct();
}

Out of memory crashes

At some point a game may crash with "out of memory", though in theory it should fit fine. When this happens, compare your normal game memory footprint with the allocated memory size when the crash happens. If the numbers are not similar, then there is a memory spike. This might be due to:

Page last updated: 2012-10-10



MobileOptimisation

Just like on PCs, mobile platforms like iOS and Android have devices of various levels of performance. You can easily find a phone that's 10x more powerful for rendering than another. A quite easy way of scaling:

  1. Make sure it runs okay on baseline configuration
  2. Use more eye-candy on higher performing configurations:
    • Resolution
    • Post-processing
    • MSAA
    • Anisotropy
    • Shaders
    • Fx/particles density, on/off

Focus on GPUs

Graphics performance is bound by fillrate, pixel complexity and geometric complexity (vertex count). All three of these can be reduced if you can find a way to cull more renderers. Occlusion culling could help here. Unity automatically culls objects outside the viewing frustum.

On mobiles you're essentially fillrate bound (fillrate = screen pixels * shader complexity * overdraw), and over-complex pixel shaders are the most common cause of problems. So use the mobile shaders that come with Unity, or design your own but make them as simple as possible. If possible, simplify your pixel shaders by moving code to the vertex shader.

If reducing the Texture Quality in Quality Settings makes the game run faster, you are probably limited by memory bandwidth. So compress textures, use mipmaps, reduce texture size, etc.

LOD (Level of Detail) makes objects simpler, or eliminates them completely, as they move further away. The main goal is to reduce the number of draw calls.

Good practice

Mobile GPUs have huge constraints in how much heat they produce, how much power they use, and how large or noisy they can be. So compared to the desktop parts, mobile GPUs have way less bandwidth, low ALU performance and texturing power. The architectures of the GPUs are also tuned to use as little bandwidth & power as possible.

Unity is optimized for OpenGL ES 2.0 and uses the GLSL ES (similar to HLSL) shading language. Built-in shaders are most often written in HLSL (also known as Cg). This is cross-compiled into GLSL ES for mobile platforms. You can also write GLSL directly if you want to, but doing that limits you to OpenGL-like platforms (e.g. mobile + Mac), since there currently are no GLSL->HLSL translation tools. When you use float/half/fixed types in HLSL, they end up as the highp/mediump/lowp precision qualifiers in GLSL ES.

Here is the checklist for good practice:

  1. Keep the number of materials as low as possible. This makes it easier for Unity to batch stuff.
  2. Use texture atlases (large images containing a collection of sub-images) instead of a number of individual textures. These are faster to load, have fewer state switches, and are batching friendly.
  3. Use Renderer.sharedMaterial instead of Renderer.material if using texture atlases and shared materials.
  4. Forward rendered pixel lights are expensive.
    • Use light mapping instead of realtime lights where ever possible.
    • Adjust the pixel light count in Quality Settings. Essentially only the directional light should be per-pixel, everything else per-vertex. Of course this depends on the game.
  5. Experiment with Render Mode of Lights in the Quality Settings to get the correct priority.
  6. Avoid Cutout (alpha test) shaders unless really necessary.
  7. Keep Transparent (alpha blend) screen coverage to a minimum.
  8. Try to avoid situations where multiple lights illuminate any given object.
  9. Try to reduce the overall number of shader passes (Shadows, pixel lights, reflections).
  10. Rendering order is critical. In general case:
    1. fully opaque objects roughly front-to-back.
    2. alpha tested objects roughly front-to-back.
    3. skybox.
    4. alpha blended objects (back to front if needed).
  11. Post Processing is expensive on mobiles, use with care.
  12. Particles: reduce overdraw, use the simplest possible shaders.
  13. Double buffer for Meshes modified every frame:
Mesh meshA, meshB;
Mesh bufferMesh;
MeshFilter meshFilter;
Vector3[] vertices;
bool on;

void Update () {
  // flip between the two meshes so we never modify the mesh
  // that is currently being rendered
  bufferMesh = on ? meshA : meshB;
  on = !on;
  bufferMesh.vertices = vertices; // modification to mesh
  meshFilter.sharedMesh = bufferMesh;
}

Shader optimizations

Checking if you are fillrate-bound is easy: does the game run faster if you decrease the display resolution? If yes, you are limited by fillrate.

Try reducing shader complexity by the following methods:

Focus on CPUs

It is often the case that games are limited by the GPU on pixel processing, so they end up with unused CPU power, especially on multicore mobile CPUs. It is therefore often sensible to pull some work off the GPU and put it onto the CPU instead (Unity does all of these): mesh skinning, batching of small objects, and particle geometry updates.

These should be used with care, not blindly. If you are not bound by draw calls, then batching can actually be worse for performance, since it makes culling less efficient and causes more objects to be affected by lights!

Good practice

Physics

Physics can be CPU heavy. It can be profiled via the Editor profiler. If Physics appears to take too much time on CPU:

Android

GPU

These are the popular mobile GPU architectures. The hardware vendors are different from those in the PC/console space, and the GPU architectures are very different from the usual desktop GPUs.

  • ImgTec PowerVR SGX - Tile-based, deferred: renders everything in small tiles (such as 16x16), shades only visible pixels
  • NVIDIA Tegra - Classic: renders everything
  • Qualcomm Adreno - Tiled: renders everything in large tiles (such as 256k); Adreno 3xx can switch to traditional mode
  • ARM Mali - Tiled: renders everything in small tiles (such as 16x16)

Spend some time looking into the different rendering approaches and design your game accordingly. Pay special attention to sorting. Define the lowest-end supported devices early in the development cycle and test on them with the profiler on as you design your game.

Use platform specific texture compression.

Further reading

Screen resolution

Android version

iOS

GPU

Only the PowerVR architecture (tile-based deferred) needs to be considered.

  • ImgTec PowerVR SGX - Tile-based, deferred: renders everything in tiles, shades only visible pixels
  • ImgTec PowerVR MBX - Tile-based, deferred, fixed-function: pre-iPhone 4/iPad 1 devices

This means:

  • Mipmaps are not so necessary.
  • Antialiasing and anisotropic filtering are cheap enough; in some cases they are not even needed on iPad 3

And cons:

  • If vertex data per frame (number of vertices * storage required after vertex shader) exceeds the internal buffers allocated by the driver, the scene has to be split which costs performance. The driver might allocate a larger buffer after this point, or you might need to reduce your vertex count. This becomes apparent on iPad2 (iOS 4.3) at around 100 thousand vertices with quite complex shaders.
  • TBDR needs more transistors allocated for the tiling and deferred parts, conceptually leaving fewer transistors for raw performance. It's very hard (i.e. practically impossible) to get GPU timings for a draw call on TBDR, which makes profiling hard.

Further reading

Screen resolution

iOS version

Dynamic Objects

Asset Bundles

Is there a limit to the number of AssetBundles that can be downloaded simultaneously on iOS? (e.g. can we safely download more than 10 AssetBundles at the same time, or every frame?)

Downloads are implemented via the async API provided by the OS, so the OS decides how many threads to create for downloads. When launching multiple concurrent downloads, keep in mind the total bandwidth the device can support and the amount of free memory. Each concurrent download allocates its own temporary buffer, so be careful not to run out of memory.

Resources

Silly issues checklist

Sometimes there's nothing in the console, just a random crash

Page last updated: 2012-10-10



Advanced

Page last updated: 2007-11-16



Vector Cookbook


Although vector operations are easy to describe, they are surprisingly subtle and powerful and have many uses in games programming. The following pages offer some suggestions about using vectors effectively in your code.

Page last updated: 2011-08-26



UnderstandingVectorArithmetic

Vector arithmetic is fundamental to 3D graphics, physics and animation and it is useful to understand it in depth to get the most out of Unity. Below are descriptions of the main operations and some suggestions about the many things they can be used for.

Addition

When two vectors are added together, the result is equivalent to taking the original vectors as "steps", one after the other. Note that the order of the two parameters doesn't matter, since the result is the same either way.

If the first vector is taken as a point in space then the second can be interpreted as an offset or "jump" from that position. For example, to find a point 5 units above a location on the ground, you could use the following calculation:-

var pointInAir = pointOnGround + new Vector3(0, 5, 0);

If the vectors represent forces then it is more intuitive to think of them in terms of their direction and magnitude (the magnitude indicates the size of the force). Adding two force vectors results in a new vector equivalent to the combination of the forces. This concept is often useful when applying forces with several separate components acting at once (eg, a rocket being propelled forward may also be affected by a crosswind).
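Both interpretations can be sketched in stand-alone Python, with plain tuples standing in for Vector3 (the names here are illustrative, not Unity API):

```python
# Minimal 3D vector addition, illustrating the "steps" interpretation
# and the combination of forces.
def add(a, b):
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

point_on_ground = (10.0, 0.0, 3.0)
offset_up = (0.0, 5.0, 0.0)

# A point 5 units above a location on the ground.
point_in_air = add(point_on_ground, offset_up)

# Order does not matter: a + b == b + a.
assert add(point_on_ground, offset_up) == add(offset_up, point_on_ground)

# Combining forces: forward thrust plus a crosswind acting at once.
thrust = (0.0, 0.0, 50.0)
crosswind = (5.0, 0.0, 0.0)
net_force = add(thrust, crosswind)
```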

Subtraction

Vector subtraction is most often used to get the direction and distance from one object to another. Note that the order of the two parameters does matter with subtraction:-

// The vector d has the same magnitude as c but points in the opposite direction.
var c = b - a;
var d = a - b;

As with numbers, adding the negative of a vector is the same as subtracting the positive.

// These both give the same result.
var c = a - b;
var c = a + -b;

The negative of a vector has the same magnitude as the original and points along the same line but in the exact opposite direction.

Scalar Multiplication and Division

When discussing vectors, it is common to refer to an ordinary number (eg, a float value) as a scalar. The meaning of this is that a scalar only has "scale" or magnitude whereas a vector has both magnitude and direction.

Multiplying a vector by a scalar results in a vector that points in the same direction as the original. However, the new vector's magnitude is equal to the original magnitude multiplied by the scalar value.

Likewise, scalar division divides the original vector's magnitude by the scalar.

These operations are useful when the vector represents a movement offset or a force. They allow you to change the magnitude of the vector without affecting its direction.

When any vector is divided by its own magnitude, the result is a vector with a magnitude of 1, which is known as a normalized vector. If a normalized vector is multiplied by a scalar then the magnitude of the result will be equal to that scalar value. This is useful when the direction of a force is constant but the strength is controllable (eg, the force from a car's wheel always pushes forwards but the power is controlled by the driver).
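The normalize-then-scale pattern can be sketched in stand-alone Python (plain tuples rather than Unity's Vector3; all names are hypothetical):

```python
import math

def scale(v, s):
    return (v[0] * s, v[1] * s, v[2] * s)

def magnitude(v):
    return math.sqrt(v[0] ** 2 + v[1] ** 2 + v[2] ** 2)

def normalize(v):
    # Dividing a vector by its own magnitude yields a unit vector.
    return scale(v, 1.0 / magnitude(v))

# The wheel always pushes along its forward direction...
forward = normalize((3.0, 0.0, 4.0))
# ...while the driver controls only the strength.
engine_power = 10.0
force = scale(forward, engine_power)

assert abs(magnitude(forward) - 1.0) < 1e-9
# Scaling a normalized vector gives a magnitude equal to the scalar.
assert abs(magnitude(force) - engine_power) < 1e-9
```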

Dot Product

The dot product takes two vectors and returns a scalar. This scalar is equal to the magnitudes of the two vectors multiplied together and the result multiplied by the cosine of the angle between the vectors. When both vectors are normalized, the cosine essentially states how far the first vector extends in the second's direction (or vice-versa - the order of the parameters doesn't matter).

It is easy enough to think in terms of angles and then find the corresponding cosines using a calculator. However, it is useful to get an intuitive understanding of some of the main cosine values as shown in the diagram below:-

The dot product is a very simple operation that can be used in place of the Mathf.Cos function or the vector magnitude operation in some circumstances (it doesn't do exactly the same thing, but sometimes the effect is equivalent). However, calculating the dot product takes much less CPU time, so it can be a valuable optimization.
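The relationship between the dot product and the cosine can be sketched in stand-alone Python (tuples rather than Vector3; illustrative names only):

```python
import math

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def magnitude(v):
    return math.sqrt(dot(v, v))

a = (1.0, 0.0, 0.0)
b = (1.0, 1.0, 0.0)

# dot(a, b) == |a| * |b| * cos(angle), so the angle can be recovered.
cos_angle = dot(a, b) / (magnitude(a) * magnitude(b))
angle = math.degrees(math.acos(cos_angle))  # 45 degrees for these vectors

# When both inputs are normalized the dot product *is* the cosine,
# avoiding the slower trig call entirely.
```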

Cross Product

The other operations are defined for 2D and 3D vectors and indeed vectors with any number of dimensions. The cross product, by contrast, is only meaningful for 3D vectors. It takes two vectors as input and returns another vector as its result.

The result vector is perpendicular to the two input vectors. The "left hand rule" can be used to remember the direction of the output vector from the ordering of the input vectors. If the first parameter is matched up to the thumb of the hand and the second parameter to the forefinger, then the result will point in the direction of the middle finger. If the order of the parameters is reversed then the resulting vector will point in the exact opposite direction but will have the same magnitude.

The magnitude of the result is equal to the magnitudes of the input vectors multiplied together and then that value multiplied by the sine of the angle between them. Some useful values of the sine function are shown below:-

The cross product can seem complicated since it combines several useful pieces of information in its return value. However, like the dot product, it is very efficient mathematically and can be used to optimize code that would otherwise depend on slow transcendental functions.
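The properties described above can be checked in stand-alone Python (tuples rather than Vector3; illustrative names only):

```python
import math

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def magnitude(v):
    return math.sqrt(dot(v, v))

a = (2.0, 0.0, 0.0)
b = (0.0, 3.0, 0.0)
c = cross(a, b)

# The result is perpendicular to both inputs.
assert dot(c, a) == 0.0 and dot(c, b) == 0.0

# |c| == |a| * |b| * sin(angle); these inputs are at 90 degrees.
expected = magnitude(a) * magnitude(b) * math.sin(math.radians(90.0))
assert abs(magnitude(c) - expected) < 1e-9

# Reversing the inputs flips the direction but keeps the magnitude.
assert cross(b, a) == (0.0, 0.0, -6.0)
```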

Page last updated: 2011-08-26



DirectionDistanceFromOneObjectToAnother

If one point in space is subtracted from another then the result is a vector that "points" from one object to the other:

// Gets a vector that points from the player's position to the target's.
var heading = target.position - player.position;

As well as pointing in the direction of the target object, this vector's magnitude is equal to the distance between the two positions. It is common to need a normalized vector giving the direction to the target and also the distance to the target (say for directing a projectile). The distance between the objects is equal to the magnitude of the heading vector and this vector can be normalized by dividing it by its magnitude:-

var distance = heading.magnitude;
var direction = heading / distance;  // This is now the normalized direction.

This approach is preferable to using both the magnitude and normalized properties separately, since they are both quite CPU-hungry (each involves calculating a square root).

If you only need to use the distance for comparison (for a proximity check, say) then you can avoid the magnitude calculation altogether. The sqrMagnitude property gives the square of the magnitude value, and is calculated like the magnitude but without the time-consuming square root operation. Rather than compare the magnitude against a known distance, you can compare the squared magnitude against the squared distance:-

if (heading.sqrMagnitude < maxRange * maxRange) {
	// Target is within range.
}

This is much more efficient than using the true magnitude in the comparison.
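The equivalence of the two comparisons can be sketched in stand-alone Python (tuples rather than Vector3; illustrative names only):

```python
import math

def sqr_magnitude(v):
    return v[0] ** 2 + v[1] ** 2 + v[2] ** 2

heading = (3.0, 0.0, 4.0)  # true magnitude 5
max_range = 6.0

# Comparing squared values gives the same answer as comparing true
# magnitudes, but skips the square root entirely.
in_range_fast = sqr_magnitude(heading) < max_range * max_range
in_range_slow = math.sqrt(sqr_magnitude(heading)) < max_range
assert in_range_fast == in_range_slow
```

This works because squaring preserves ordering for non-negative values, and both a magnitude and a distance are always non-negative.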

Sometimes, the overground heading to a target is required. For example, imagine a player standing on the ground who needs to approach a target floating in the air. If you subtract the player's position from the target's then the resulting vector will point upwards towards the target. This is not suitable for orienting the player's transform since he will also point upwards; what is really needed is a vector from the player's position to the position on the ground directly below the target. This is easily obtained by taking the result of the subtraction and setting the Y coordinate to zero:-

var heading = target.position - player.position;
heading.y = 0;	// This is the overground heading.

Page last updated: 2011-08-26



ComputingNormalPerpendicularVector

A normal vector (ie, a vector perpendicular to a plane) is required frequently during mesh generation and may also be useful in path following and other situations. Given three points in the plane, say the corner points of a mesh triangle, it is easy to find the normal. Pick any of the three points and then subtract it from each of the two other points separately to give two vectors:-

var a: Vector3;
var b: Vector3;
var c: Vector3;

var side1: Vector3 = b - a;
var side2: Vector3 = c - a;

The cross product of these two vectors will give a third vector which is perpendicular to the surface. The "left hand rule" can be used to decide the order in which the two vectors should be passed to the cross product function. As you look down at the top side of the surface (from which the normal will point outwards) the first vector should sweep around clockwise to the second:-

var perp: Vector3 = Vector3.Cross(side1, side2);

The result will point in exactly the opposite direction if the order of the input vectors is reversed.

For meshes, the normal vector must also be normalized. This can be done with the normalized property, but there is another trick which is occasionally useful. You can also normalize the perpendicular vector by dividing it by its magnitude:-

var perpLength = perp.magnitude;
perp /= perpLength;

It turns out that the area of the triangle is equal to perpLength / 2. This is useful if you need to find the surface area of the whole mesh or want to choose triangles randomly with probability based on their relative areas.
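The whole procedure, including the area relation, can be sketched in stand-alone Python (tuples rather than Vector3; illustrative names only):

```python
import math

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def magnitude(v):
    return math.sqrt(v[0] ** 2 + v[1] ** 2 + v[2] ** 2)

# A right triangle with legs 3 and 4 lying in the XZ plane.
a = (0.0, 0.0, 0.0)
b = (3.0, 0.0, 0.0)
c = (0.0, 0.0, 4.0)

# Two edge vectors from the same corner, then their cross product.
perp = cross(sub(b, a), sub(c, a))
perp_length = magnitude(perp)

# Normalize by dividing by the magnitude.
normal = tuple(x / perp_length for x in perp)
# The triangle's area is half the cross product's magnitude.
area = perp_length / 2
```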

Page last updated: 2011-08-26



AmountVectorMagnitudeInAnotherDirection

A car's speedometer typically works by measuring the rotational speed of one of the unpowered wheels. The car may not be moving directly forward (it may be skidding sideways, for example) in which case part of the motion will not be in the direction the speedometer can measure. The magnitude of an object's rigidbody.velocity vector will give the speed in its direction of overall motion but to isolate the speed in the forward direction, you should use the dot product:-

var fwdSpeed = Vector3.Dot(rigidbody.velocity, transform.forward);

Naturally, the direction can be anything you like but the direction vector must always be normalized for this calculation. Not only is the result more correct than the magnitude of the velocity, it also avoids the slow square root operation involved in finding the magnitude.
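The projection onto a direction can be sketched in stand-alone Python (tuples rather than Vector3; illustrative names only):

```python
def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

# A skidding car: moving mostly forward (+z) but also sideways (+x).
velocity = (2.0, 0.0, 10.0)
forward = (0.0, 0.0, 1.0)  # the direction vector must be normalized

# Only the component of the velocity along the forward direction.
fwd_speed = dot(velocity, forward)
```

Note that the sideways component (2.0 here) is discarded entirely, which is exactly the behaviour a wheel-based speedometer exhibits.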

Page last updated: 2011-08-26



AssetBundles

AssetBundles are files which you can export from Unity to contain assets of your choice. These files use a proprietary compressed format and can be loaded on demand by your application. This allows you to stream in content, such as models, textures, audio clips, or even entire scenes separately from the scene in which they will be used. AssetBundles have been designed to simplify downloading content to your application. AssetBundles can contain any kind of asset type recognized by Unity, as determined by the filename extension. If you want to include files with custom binary data, they should have the extension ".bytes". Unity will import these files as TextAssets.

When working with AssetBundles, here's the typical workflow:

During development, the developer prepares AssetBundles and uploads them to a server.

                       Building and uploading asset bundles
  1. Building AssetBundles. Asset bundles are created in the editor from assets in your scene. The Asset Bundle building process is described in more detail in the section for Building AssetBundles
  2. Uploading AssetBundles to external storage. This step does not include the Unity Editor or any other Unity channels, but we include it for completeness. You can use an FTP client to upload your Asset Bundles to the server of your choice.

At runtime, on the user's machine, the application will load AssetBundles on demand and operate individual assets within each AssetBundle as needed.

                       Downloading AssetBundles and loading assets from them
  1. Downloading AssetBundles at runtime from your application. This is done from script within a Unity scene, and Asset Bundles are loaded from the server on demand. More on that in Downloading Asset Bundles.
  2. Loading objects from AssetBundles. Once the AssetBundle is downloaded, you might want to access its individual Assets from the Bundle. More on that in Loading Resources from AssetBundles

See also:

Page last updated: 2012-10-11



Frequently Asked Questions

  1. What are AssetBundles?
  2. What are they used for?
  3. How do I create an AssetBundle?
  4. How do I use an AssetBundle?
  5. How do I use AssetBundles in the Editor?
  6. How do I cache AssetBundles?
  7. Are AssetBundles cross-platform?
  8. How are assets in AssetBundles identified?
  9. Can I reuse my AssetBundles in another game?
  10. Will an AssetBundle built now be usable with future versions of Unity?
  11. How can I list the objects in an AssetBundle?

  1. What are AssetBundles?

AssetBundles are a collection of assets, packaged for loading at runtime. With Asset Bundles, you can dynamically load and unload new content into your application. AssetBundles can be used to implement post-release DLC.

  2. What are they used for?

They can be used to reduce the amount of disk space used by your game when it is first deployed. They can also be used to add new content to an already published game.

  3. How do I create an AssetBundle?

To create an AssetBundle you need to use the BuildPipeline editor class. All scripts using Editor classes must be placed in a folder named Editor, anywhere in the Assets folder.